9. Five Star Open Data
1. Make your stuff available on the Web (whatever format) under an open license.
2. Make it available as structured data (e.g., Excel instead of an image scan of a table).
3. Use non-proprietary formats (e.g., CSV instead of Excel).
4. Use URIs to denote things, so that people can point at your stuff.
5. Link your data to other data to provide context.
5stardata.info by Tim Berners-Lee, the inventor of the Web.
10. Open Data
● Data.One
– Led by the OGCIO of the Hong Kong Government.
– Uses the term “public sector information” (PSI)
instead of “open data”.
– Much of the data is not available in a machine-readable
format with a useful data structure.
– A lot of the data still requires web scraping with
customized data extraction to collect useful
machine-readable data.
12. Web Scraping
“a computer software technique of extracting
information from websites.” (Wikipedia)
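As a minimal illustration of the idea (without Scrapy), the stdlib `html.parser` can pull text out of a page; the HTML snippet here is made up for the example:

```python
from html.parser import HTMLParser

# Collect the text of every <td> cell from an HTML table.
class TableTextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_td = False
        self.cells = []

    def handle_starttag(self, tag, attrs):
        if tag == 'td':
            self.in_td = True

    def handle_endtag(self, tag):
        if tag == 'td':
            self.in_td = False

    def handle_data(self, data):
        if self.in_td and data.strip():
            self.cells.append(data.strip())

html = "<table><tr><td>HKO</td><td>28.1</td></tr></table>"
parser = TableTextExtractor()
parser.feed(html)
print(parser.cells)  # ['HKO', '28.1']
```

A framework like Scrapy replaces this hand-rolled parsing with selectors, spiders, and pipelines.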
13. Scrapy
● Python.
● Open source web scraping framework.
● Scrape websites and extract structured data.
● From data mining to monitoring and
automated testing.
14. Scrapy
● Define your own data structures.
● Write spiders to extract data.
● Built-in XPath selectors for extracting data.
● Built-in JSON, CSV, XML output.
● Interactive shell console, telnet console,
logging......
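Conceptually, the built-in feed exports just serialize each scraped item; a stdlib sketch of what the JSON and CSV output look like (the field values are invented sample data):

```python
import csv
import io
import json

# Two scraped items, as Scrapy would hand them to a feed exporter.
items = [
    {'station': 'HKO', 'temperture': 28.1, 'humidity': 75},
    {'station': 'SHA', 'temperture': 27.6, 'humidity': 80},
]

# JSON feed: a list of item dicts.
json_out = json.dumps(items)

# CSV feed: one row per item, fields as columns.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=['station', 'temperture', 'humidity'])
writer.writeheader()
writer.writerows(items)

print(json_out)
print(buf.getvalue())
```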
15. scrapyd
● Scrapy web service daemon.
● pip install scrapyd
● Web API with simple Web UI:
– http://localhost:6800
● Web API Documentation:
– http://scrapyd.readthedocs.org/en/latest/api.html
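The Web API is plain HTTP; scheduling a spider run, for example, is a POST to `/schedule.json` with the project and spider names as form fields. The sketch below only builds the request (actually sending it requires a running scrapyd daemon):

```python
from urllib.parse import urlencode
from urllib.request import Request

# schedule.json starts a spider run; project and spider are form fields.
data = urlencode({'project': 'hk0weather', 'spider': 'regionalwx'}).encode()
req = Request('http://localhost:6800/schedule.json', data=data)

print(req.get_method(), req.full_url)
# To actually send it: urllib.request.urlopen(req)
```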
19. Creating Scrapy Project
● Define your data structure
● Write your first spider
– Test with scrapy shell console
● Output / Store collected data
– Output with built-in supported formats
– Store to database / object store.
20. Define your data structure
items.py
from scrapy.item import Item, Field

class Hk0WeatherItem(Item):
    reporttime = Field()
    station = Field()
    temperture = Field()
    humidity = Field()
21. Write your first spider
● Import the class of your own data structure.
– $ scrapy genspider -t basic <YOUR SPIDER NAME> <DOMAIN>
– $ scrapy list
● Import any Scrapy classes that you require.
– e.g., Spider, XPath selector
● Extend the parse() function of the Spider class.
● Test with the scrapy shell console.
– $ scrapy shell <URL>
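A spider's parse() boils down to: take the response body, select nodes, yield one item per node. The stdlib sketch below mimics that flow on a canned XML fragment (the markup and values are invented; a real spider would use Scrapy's response and XPath selectors instead of ElementTree):

```python
import xml.etree.ElementTree as ET

SAMPLE = """
<stations>
  <station><name>HKO</name><temp>28.1</temp></station>
  <station><name>SHA</name><temp>27.6</temp></station>
</stations>
"""

def parse(body):
    # Mimics Spider.parse(): select each node, yield one record per station.
    root = ET.fromstring(body)
    for node in root.findall('.//station'):
        yield {
            'station': node.findtext('name'),
            'temperture': float(node.findtext('temp')),
        }

records = list(parse(SAMPLE))
print(records)
```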
22. Output / Store collected data
● Use the built-in JSON, CSV, XML output at the
command line.
– $ scrapy crawl <Spider Name> -t json -o <Output File>
● pipelines.py
– Import the class of your own data structure.
– Extend the process_item() function.
– Add to ITEM_PIPELINES in settings.
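A pipeline class only needs a process_item() method. A plain-Python sketch (no Scrapy import needed) of one plausible cleanup step, dropping items with a missing temperature reading; in real Scrapy you would raise scrapy.exceptions.DropItem, stood in for here by ValueError:

```python
class DropIncompletePipeline:
    # Called once per scraped item; return the item to keep it,
    # raise to discard it (Scrapy would use DropItem here).
    def process_item(self, item, spider):
        if item.get('temperture') is None:
            raise ValueError('missing temperture: %r' % item)
        return item

pipeline = DropIncompletePipeline()
kept = pipeline.process_item({'station': 'HKO', 'temperture': 28.1}, spider=None)
print(kept)
```

To activate it, the class is listed in ITEM_PIPELINES in settings.py; the module path depends on your project layout.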
28. Create django app
● Define your own data model.
● Define and activate your admin UI.
● Furthermore:
– Define your data views.
– Add URL routes to connect with data views.
29. Define django data model
● Define at models.py.
● Import django data model base class.
● Define your own data model class.
● Create database table(s).
– $ python manage.py syncdb
30. Define django data model
from django.db import models

class WeatherData(models.Model):
    reporttime = models.DateTimeField()
    station = models.CharField(max_length=3)
    temperture = models.FloatField(null=True, blank=True)
    humidity = models.IntegerField(null=True, blank=True)
31. Define django data model
● admin.py
– Import admin class
– Import your own data model class.
– Extend admin class for your data model.
– Register the admin class
● with admin.site.register() function.
32. Define django data model
from django.contrib import admin

class WeatherDataAdmin(admin.ModelAdmin):
    list_display = ('reporttime', 'station',
                    'temperture', 'humidity', 'windspeed')
    list_filter = ['station']

admin.site.register(WeatherData, WeatherDataAdmin)
33. Enable django admin ui
● Add to INSTALLED_APPS in settings.py
– django.contrib.admin
● Add a URL router in urls.py
● Run the development server
– $ python manage.py runserver
● Access the admin UI
– http://127.0.0.1:8000/admin
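A sketch of the urls.py wiring, assuming a syncdb-era Django (1.6-style, matching the commands above; newer versions use path() and drop autodiscover()):

```python
# urls.py (Django 1.6-style sketch)
from django.conf.urls import patterns, include, url
from django.contrib import admin

admin.autodiscover()  # find admin.py in each installed app

urlpatterns = patterns('',
    url(r'^admin/', include(admin.site.urls)),
)
```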
35. Scrapy + Django
● Define the django environment in the scrapy settings.
– Load the django configuration.
● Use the Scrapy DjangoItem class
– Instead of the Item and Field classes.
– Define which django data model it should be
linked with.
● Query and insert data in the scrapy pipelines.
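A sketch of the glue code, matching the era of the deck; the paths and module names below are assumptions to adjust to your layout, and on newer Scrapy versions DjangoItem lives in the separate scrapy_djangoitem package:

```python
# scrapy settings.py: make the django project importable and configured
import os
import sys
sys.path.append('/path/to/django/project')  # assumption: your project path
os.environ['DJANGO_SETTINGS_MODULE'] = 'mysite.settings'  # hypothetical name

# items.py: link the item to the django model instead of Item/Field
from scrapy.contrib.djangoitem import DjangoItem
from weather.models import WeatherData  # hypothetical app name

class Hk0WeatherItem(DjangoItem):
    django_model = WeatherData

# pipelines.py: item.save() then inserts a row via the django ORM
```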
37. hk0weather
● Weather Data Project.
– https://github.com/sammyfung/hk0weather
– Converts weather information from HKO
webpages to JSON data.
– Python + Scrapy + Django
38. hk0weather
● Hong Kong Weather Data.
– 20+ HKO weather stations in Hong Kong.
– Regional weather data.
– Rainfall data.
– Weather forecast report.
39. hk0weather
● Set up and activate a Python virtual environment,
and install Scrapy and Django with pip.
● Clone hk0weather from GitHub
– $ git clone https://github.com/sammyfung/hk0weather.git
● Set up the database connection in Django and create
the database, tables, and first django user.
● Scrape regional weather data
– $ scrapy crawl regionalwx -t json -o regional.json