Beautiful Soup HTML parsing

The following Python code fetches the windspeed web page and extracts the timestamp, average windspeed, direction and gust speed, writing the data to a date-stamped file such as /home/user/wind_data/windspeed_2015-04-21-12.txt. Schedule a cron job to run it once a day, say at midnight. The file for a particular day can then be selected and processed by graph.py.

    #!/usr/bin/python
    import os
    import time

    import requests
    from bs4 import BeautifulSoup

    # Build a date-stamped output file name, e.g. windspeed_2015-04-21-12.txt
    date_stamp = time.strftime('%Y-%m-%d-%H', time.localtime(time.time()))
    outfile = os.path.join(os.path.expanduser('~'), 'wind_data',
                           "windspeed_%s.txt" % date_stamp)

    r = requests.get("http://xxxxx.wwww.yyyyy")
    soup = BeautifulSoup(r.content, 'html.parser')
    table = soup.find("table", {"id": "grid"})

    # Walk the rows of the data table and write one tab-separated line per row
    with open(outfile, 'w') as f:
        for line in table.findAll('tr'):
            cells = [l.get_text(strip=True) for l in line.findAll('td')]
            if cells:
                f.write('\t'.join(cells) + '\n')
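graph.py itself is not shown above. As a rough sketch of the parsing side (assuming, as a hypothetical, that the scraper writes one tab-separated line per table row in the order timestamp, average windspeed, direction, gust speed), it might start like this:

```python
#!/usr/bin/python
# Sketch of graph.py's input parsing. Column order (timestamp, average
# windspeed, direction, gust speed) is an assumption, not confirmed by
# the original notes.

def parse_windspeed(lines):
    """Return (timestamps, averages, gusts) from tab-separated lines."""
    times, avgs, gusts = [], [], []
    for line in lines:
        parts = line.rstrip('\n').split('\t')
        if len(parts) < 4:
            continue  # skip header or malformed rows
        times.append(parts[0])
        avgs.append(float(parts[1]))   # average windspeed
        gusts.append(float(parts[3]))  # gust speed
    return times, avgs, gusts

# Example with hypothetical file contents:
sample = ["2015-04-21 12:00\t5.2\tNW\t8.1",
          "2015-04-21 12:10\t4.0\tN\t6.5"]
times, avgs, gusts = parse_windspeed(sample)
print(times, avgs, gusts)
```

The three lists returned here could then be fed to a plotting library such as matplotlib to draw the average and gust curves for the chosen day.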
VirtualBox CentOS 5 setup

Get the VirtualBox CentOS 5 image as a 7z file from http://virtualboxes.org/images/centos/. Uncompress it and save the Centos64.vbox and Centos64.vdi files. Open the Oracle VM VirtualBox Manager and select the vbox file to install it. Make the disc larger than 20 GB or there will not be room for logger. Log in as root/reverse.

Install the GNOME desktop as follows and start it:

    yum groupinstall "X Window System" "GNOME Desktop Environment"

Log in as root/reverse and run startx.

Check the version of CentOS and other preliminaries:

    cat /etc/redhat-release   # CentOS release 5.9 (Final)
    uname -a                  # should report x86_64 somewhere

Then:
- create the user logger
- open port 443
- mark the logger bin file as executable (tick the execute box), double-click it, and run it in a terminal

If there is not enough space to install logger, resize the virtual disc from the host:

    C:\Program Files\Oracle\VirtualBox\VBoxManage.exe modifyhd "D:\virtual machines\Centos\centos64.vdi" --resize 20000

Then shut down the CentOS VM, attach gparted-live-0.16.2-1b-i486.iso to the CD drive, resize sda up to the increased size, and start the CentOS VM.
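The notes say to open port 443 but do not show how. On CentOS 5 this is typically done through the stock iptables service; a sketch, assuming the default firewall setup (run as root inside the VM):

```shell
# Allow inbound TCP connections on port 443
iptables -I INPUT -p tcp --dport 443 -j ACCEPT
# Persist the rule across reboots (CentOS 5 iptables init script)
service iptables save
```

If the image ships with a different firewall configuration, adjust accordingly, e.g. via system-config-securitylevel.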