And so we are seeing a change in project timelines, specifically in larger organizations.
And so this can be somewhat frustrating.
This is exactly where we were this spring. We had created a suite of applications for the US Forest Service, and wanted to launch the public-facing application.
Since there was no option to run ArcGIS Server pre-release on USDA servers, we talked with Esri and were able to join the EC2 beta program.
Of course today, it’s a little easier – if you have an EDN license, you can use ArcGIS Server on EC2 for development and testing.
First off, there is no free lunch here, so grab your credit card…
Create an Amazon Web Services account,
Once you’ve got your AWS account, the next step is to get in touch with Esri and ask for “AMIs”
Once you’ve got an account, you need access to the Amazon Machine Images. These are pre-configured machines just waiting to be called into action. Sitting on a shelf if you will.
http://www.flickr.com/photos/yakinik/4625973891
In our office, we have 3 physical servers involved…
But when we move to the cloud, we need to think about what this is gonna cost on a monthly basis…
A large instance for production will run ~$350/mo
But if that’s too rich for your blood,
You can run a small instance for dev/demo
Work hours only: ~$20/mo
Given that we were doing a proof of concept for our client, AND we were beta testing, we opted to go cheap, and run everything in a single large instance
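The monthly figures above can be sanity-checked with some back-of-the-envelope arithmetic. The hourly rates below are assumptions for illustration only; check Amazon’s current on-demand pricing before budgeting.

```python
# Rough EC2 cost comparison. Hourly rates are ASSUMED values for
# illustration -- look up Amazon's current on-demand pricing.
LARGE_HOURLY = 0.48            # assumed rate for a large instance ($/hr)
SMALL_HOURLY = 0.12            # assumed rate for a small instance ($/hr)
HOURS_PER_MONTH = 730          # running 24/7
WORK_HOURS_PER_MONTH = 8 * 21  # ~168 hrs: business hours only

always_on_large = LARGE_HOURLY * HOURS_PER_MONTH        # ~ $350/mo
work_hours_small = SMALL_HOURLY * WORK_HOURS_PER_MONTH  # ~ $20/mo
print(f"24/7 large:  ${always_on_large:.0f}/mo")
print(f"9-5 small:   ${work_hours_small:.0f}/mo")
```

The takeaway: a small instance you shut down outside work hours is an order of magnitude cheaper than an always-on production box.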
With that behind us it was time to fire things up, which is super easy… Launch Instance… get a coffee, and…
Esri has some convenient video walk-throughs for this, so we’ll skip this part. Suffice to say it takes about 10 minutes.
And then you can RDP to the server.
http://www.flickr.com/photos/taniaedu/1554390505
Our application talks to 3 types of services – a tile service hosted by Esri, a feature service for the interactive state/county boundaries, and a custom data service running in ASP.NET MVC.
Since everything was service-based and configurable, no changes were required to the application tier. In fact, the first thing we did was push the application across and validate that it still worked – the only hitch was that it was still hitting our servers for data! ;-)
Our original data storage design was another story.
Since the plan was to deploy this at a USDA data center, keeping both the spatial and tabular data in the same database is easiest. Simple enough in SQL Server…
It’s just a little “messy” shall we say.
But we don’t have ArcSDE, and more specifically we don’t even have the option of ArcSDE on SQL Server over in EC2. So, time to rip things apart.
Ok, whew! Using ArcCatalog, this is about a 30 second operation, which is good because that leaves us a little time to talk about
Performance! Specifically how we can get our app to be faster.
ArcGIS Server does a lot of different “things” on the back end, but if you boil it down it serves two things: images and JSON
Essentially, “generalization” is the process of removing nodes from a line while still retaining the character of its shape. Most commonly done using the Douglas-Peucker algorithm, but
also done via a simple tool in ArcGIS. Ok, why?
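For the curious, here is a minimal sketch of the Douglas-Peucker idea in plain Python (in practice we just ran the ArcGIS generalization tool): keep the vertex farthest from the chord between the endpoints if it deviates more than a tolerance, then recurse on both halves.

```python
import math

def perpendicular_distance(pt, start, end):
    """Distance from pt to the infinite line through start and end."""
    (x, y), (x1, y1), (x2, y2) = pt, start, end
    dx, dy = x2 - x1, y2 - y1
    if dx == 0 and dy == 0:  # degenerate chord: fall back to point distance
        return math.hypot(x - x1, y - y1)
    return abs(dy * x - dx * y + x2 * y1 - y2 * x1) / math.hypot(dx, dy)

def douglas_peucker(points, tolerance):
    """Drop vertices that deviate from the simplified line by < tolerance."""
    if len(points) < 3:
        return list(points)
    # Find the vertex farthest from the chord between the endpoints.
    dmax, index = 0.0, 0
    for i in range(1, len(points) - 1):
        d = perpendicular_distance(points[i], points[0], points[-1])
        if d > dmax:
            dmax, index = d, i
    if dmax <= tolerance:
        return [points[0], points[-1]]  # nothing in between is significant
    # Keep the farthest vertex and recurse on both halves.
    left = douglas_peucker(points[: index + 1], tolerance)
    right = douglas_peucker(points[index:], tolerance)
    return left[:-1] + right  # drop the duplicated split point
```

With a 10-point zig-zag and a tolerance of 1.0, this keeps only the 5 vertices that define the overall shape – the same kind of reduction that shrinks the county boundaries we were shipping as JSON.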
Data on the wire as JSON. This essentially made it possible to deal with the boundary layers as features. Otherwise it was taking more than a minute to load and parse just the States!
But that’s not the only benefit – by having fewer line segments, we see substantial decreases in rendering time for images
Back to our tale…
http://www.flickr.com/photos/craigmdennis/3557378176
Since the counties effectively don’t change, but the additional pest records are added annually, the best option is to set this up as a spatial view in ArcSDE,
and have the application set definition queries to show just the specific data… But we don’t have ArcSDE… so we bit the bullet and de-normalized the data.
Again, using ArcCatalog. All well and good but now we have a feature class with more than 100,000 features in it.
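Conceptually, the de-normalization is just a join of the static county boundaries against the annual pest records, flattened into one row per county/year. The table and column names below are hypothetical stand-ins, shown with SQLite for portability:

```python
import sqlite3

# Sketch of the de-normalization step. Table/column names are
# hypothetical -- the real data lived in SQL Server / ArcSDE.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE counties (fips TEXT PRIMARY KEY, name TEXT);
    CREATE TABLE pest_records (fips TEXT, year INTEGER, acres REAL);
    INSERT INTO counties VALUES ('41005', 'Clackamas'), ('41051', 'Multnomah');
    INSERT INTO pest_records VALUES
        ('41005', 2009, 120.5), ('41005', 2010, 98.0), ('41051', 2010, 40.2);
""")
# One flat row per county/year: this is what gets loaded back into the
# feature class, so the app can filter with a definition query like
# "year = 2010" instead of relying on an ArcSDE spatial view.
rows = con.execute("""
    SELECT c.fips, c.name, p.year, p.acres
    FROM counties c
    JOIN pest_records p ON p.fips = c.fips
    ORDER BY c.fips, p.year
""").fetchall()
for row in rows:
    print(row)
```

The trade-off is exactly the one noted above: the geometry is duplicated per year, which is how we ended up with a feature class of 100,000+ features.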
Custom data service running in ASP.NET MVC.
This is the tabular data that was stored alongside the spatial data. Since we are storing this in SQL Express on EC2, we wanted to keep the database as small as possible, so we “extracted” the tabular data into its own SQL database.
So at this point we’ve changed up our data so we have a File Geodatabase, and a SQL database. We zipped these files up and we’re ready to roll.
http://www.flickr.com/photos/mujitra/2134721435
Once we had the files up on the server, we just used ArcCatalog to publish the map services
Your instance is alive! But before you can play with it there are a few more things to do…
http://www.flickr.com/photos/49488791@N03/4538548843
It’s very easy to get started – between the documentation provided by Esri and Amazon (which you have to read!), it’s all quite straightforward
If you are building a new application, deciding on the cloud deployment environment early will help avoid the data juggling we had to go through.
Specifically – File Geodatabase is your Friend.
Assuming you have an EDN license ;-)
First, get approval to spend some money… and setup an Amazon AWS account
http://www.flickr.com/photos/mujitra/2527174367
Pick the size and types of instances you want
http://www.flickr.com/photos/yakinik/4625973891
Light it up and do some housekeeping
Pack up your data and application code…
http://www.flickr.com/photos/mujitra/2192326482
Re-Deploy map services
And bask in the glow of cloud awesomeness!
http://www.flickr.com/photos/seannaber/4043266651