Living in the Cloud: My Personal Cloud Setup

Altocumulus cloud, Tel-Aviv, Israel.

Setting up my new iPad and iPhone recently was an eye-opener to how dependent I have become on cloud services. I expected it to take a few days to get all my stuff (email, contacts, calendar, photos, etc.) onto the devices; instead, it was a pleasant surprise to have it all done in just a few hours.

No, the credit does not go to the devices…it goes to the cloud! Here is what my current setup looks like for the various personalized services, and how I keep the iDevices and laptops in sync.

Email Accounts: Gmail …set up using Google Sync on mobile devices. Work email is set up over IMAP, auto-configured via a custom profile. Thunderbird IMAP on the laptop.

Calendars: Google Calendar …set up using Google Sync on mobile devices. Work calendar is set up using CalDAV, auto-configured via a custom profile. Thunderbird Lightning CalDAV on the laptop.

Contacts: Google Contacts …set up using Google Sync on mobile devices. The work contacts directory is available via a special VPN-based app. The Zindus SyncML extension is set up in Thunderbird on the laptop.

Passwords and Personal Data: KeePass …the encrypted password database is shared via Dropbox and accessed over HTTP using the MyKeePass app on the iDevices.

Chat: Skype, Yahoo! Messenger …the imo app connects to all the common messaging platforms. Work IM is configured using the Oracle Beehive app, provisioned via a custom profile. Pidgin on the laptop.

My Documents: Dropbox …although the native app is quite good, what’s even better is the GoodReader app for the iPhone/iPad, which can sync with Dropbox as well as email attachments, Google Docs and WebDAV servers. Secured TrueCrypt containers also synchronize well with Dropbox; however, I haven’t found a mobile app that can securely browse and access those documents. Password-protected zip files (with AES encryption) still expose file names, and I have yet to find an iPad app that opens encrypted 7z files. Please comment if you have found a way around this. Update 07/29: the iUnarchive app on the iPhone/iPad does open password-protected zip files.

Not My Documents: Google Docs …this is stuff I want to read or keep but don’t want eating into my Dropbox storage, and that I could afford to lose, e.g. PDF articles, free ebooks, etc. Again, the GoodReader app does a great job browsing and downloading content from Google Docs. What could be better is a sync feature and the ability to view documents without downloading. Update 07/29: Using iUnarchive, I also sync the ebooks on the server with the built-in iBooks app.

Browser Bookmarks: Firefox Sync …using the Firefox Home app on the mobile devices. Tighter integration with Safari, or a native Firefox browser, would have been ideal.

Pictures: Picasaweb …browse and upload using the Piconhand app, or run slideshows using the iShowPhoto HD app (both free). Picasa on the laptop.

Videos: YouTube and Picasaweb …native YouTube app. Upload using Picasa on the laptop.

Music: Pandora, SHOUTcast and various other streaming sites and apps. For purchased/ripped content, sync via the iTunes library (eagerly awaiting wireless sync in iOS 5).

TV/Movies: Netflix, Hulu and YouTube (yes, I am a cord-cutter), along with iTunes Home Sharing, and YuppTV for international news and music.

Telephony: Google Voice, Skype …enough has been said about them.

News: Flipboard, Feedly …the apps use Google Reader RSS subscriptions as well as Twitter lists.

Thanks to the cloud, personalizing and setting up any new connected device or laptop now takes only a few settings and app installations. Now that’s what I call magical.

5 Performance Testing Considerations for Application Integrations

Performance testing (image via Flickr).

Enterprise integrations are complex both functionally, because they implement a business process, and technically, because they introduce one or more runtime layers between applications. Since these integrations typically represent end-to-end business flows, developers need to ensure that performance meets the business need.

Here are some considerations when planning performance testing for service-oriented architecture (SOA) projects that integrate enterprise applications, such as Oracle’s Application Integration Architecture (AIA).

Update April 21, 2011: AIA-specific tuning details can be found in Chapter 28 of the Developer’s Guide for AIA 11gR1 (E17364-02).

1. Define the End Goal. Clearly.

It may sound obvious, but the lack of a clear end goal is the main reason performance testing efforts go awry.

Note: “make it run faster” does not count as a clear goal!

Quantify the desired metrics objectively by setting Key Performance Indicators (KPIs). Here are some KPIs you may want to track (a minimal sketch of encoding such targets follows the list):

  • Throughput of the end-to-end business flow by users, payload size and volume
  • Response time of the end-to-end business flow by users, payload size and volume
  • Throughput of the integration layer only (legacy application interactions stubbed out)
  • Response time of the integration layer only (legacy application interactions stubbed out)
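To make such targets concrete, here is a minimal sketch in Python of how KPI targets could be recorded and checked against measured results; the flow names, numbers and fields are illustrative assumptions, not values from this article.

```python
from dataclasses import dataclass

@dataclass
class Kpi:
    """A single performance target for a flow under a given load profile."""
    flow: str                  # e.g. "Order-to-Cash end-to-end" (illustrative)
    concurrent_users: int
    payload_lines: int         # order lines per payload
    min_throughput_tpm: float  # transactions per minute, higher is better
    max_response_sec: float    # seconds, lower is better

def meets_target(kpi: Kpi, measured_tpm: float, measured_resp_sec: float) -> bool:
    """Return True only if both the throughput and the response-time targets are met."""
    return measured_tpm >= kpi.min_throughput_tpm and measured_resp_sec <= kpi.max_response_sec

# Illustrative targets: one end-to-end flow and one integration-layer-only flow
kpis = [
    Kpi("Order-to-Cash end-to-end", concurrent_users=50, payload_lines=10,
        min_throughput_tpm=120.0, max_response_sec=5.0),
    Kpi("Integration layer only (apps stubbed)", concurrent_users=50, payload_lines=10,
        min_throughput_tpm=300.0, max_response_sec=1.5),
]

print(meets_target(kpis[0], measured_tpm=135.0, measured_resp_sec=4.2))  # True
```

Writing the targets down in this form, rather than in a slide deck, also makes it trivial to re-check them after every test run.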

2. Use Metrics Relevant to the Business

System performance KPIs should be derived from business metrics so that both business and IT are involved. This results in more realistic goals than arbitrary benchmarks set by developers or vendors. For example, a throughput KPI could be derived from a formula that uses software cost and peak order volume to produce a “minimum orders per CPU core per minute” indicator that satisfies the business need.
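As a hedged illustration of such a derivation, the snippet below computes a “minimum orders per CPU core per minute” target from a peak volume and a licensed-core budget; the formula, names and figures are all invented for the example.

```python
# Hypothetical derivation of a "minimum orders per CPU core per minute" KPI.
# All figures below are illustrative assumptions, not values from the article.
peak_orders_per_hour = 36_000   # peak business volume the system must absorb
licensed_cpu_cores = 16         # cores the software budget allows us to license
headroom_factor = 1.25          # 25% safety margin over the observed peak

required_orders_per_core_per_minute = (
    peak_orders_per_hour / 60 / licensed_cpu_cores * headroom_factor
)
print(round(required_orders_per_core_per_minute, 1))  # ~46.9 orders/core/minute
```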

When looking at transactions, always consider “peak” spikes rather than the average. For example, incoming orders usually have peak periods (e.g. holiday season sales) during which the system is subject to transaction load an order of magnitude higher than at non-peak times. Defining KPIs based on peak transaction volumes not only helps in setting realistic goals, but also ensures the project truly succeeds by handling the load when the business needs it most.

Finally, don’t try to boil the ocean – identify a subset of the integration use cases which are prone to performance bottlenecks and meet all the KPIs before attempting other ones.

3. Do you REALLY Need Production Grade Hardware for Testing?

Using dedicated hardware is always better than sharing existing development or QA environments. However, every business has different needs for its enterprise applications, and those needs vary by business process. For example, an order-to-cash process may need consistently high performance under medium-to-high load, whereas a financial close process may need to handle high load only once a quarter.

Instead of buying or configuring hardware that exactly matches every possible target scenario, consider using commodity hardware with matching “normalized” KPIs that are scaled down from the target business scenario. For example, say the production hardware provides a given compute unit (CPU/memory/cache specification), and the commodity hardware is determined to be one-fourth of that compute unit. If the business KPI target is 40 orders/CPU core/minute on the production-grade hardware, then the internal, normalized KPI would be one-fourth of that, i.e. performance testing would need to achieve 10 orders/CPU core/minute on the commodity hardware to be considered successful.

Of course, the benchmark may not scale linearly, but this can be factored into the equation to give a good educated estimate of the integration performance. Compared to the alternative of not testing at all because of hardware unavailability and then discovering issues in production, commodity hardware with normalized KPIs can be a very viable performance testing approach.
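A minimal sketch of that normalization, with an optional scaling-efficiency factor for the non-linear case; the function name and numbers are assumptions used only to mirror the 40-to-10 example above.

```python
def normalized_kpi(production_kpi: float, hardware_ratio: float,
                   scaling_efficiency: float = 1.0) -> float:
    """Scale a production KPI target down to commodity test hardware.

    hardware_ratio     - commodity compute as a fraction of production compute (e.g. 0.25)
    scaling_efficiency - set below 1.0 if the workload is known not to scale linearly
    """
    return production_kpi * hardware_ratio * scaling_efficiency

# 40 orders/core/minute in production, one-fourth the compute, linear scaling assumed
print(normalized_kpi(40.0, hardware_ratio=0.25))                          # 10.0
# Same target, but assuming only ~90% scaling efficiency on the smaller box
print(normalized_kpi(40.0, hardware_ratio=0.25, scaling_efficiency=0.9))  # 9.0
```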

4. Choose a Consistent Testing Strategy

For integration scenarios, a bottom-up testing strategy is worth considering, i.e. fully optimize a single use case (to reach the desired KPIs) before introducing additional artifacts or flows.

Plan the sequencing of the use cases appropriately; it can save some cycles. For example, between a Query and an Insert use case, the Query may look simpler, but it needs data that the Insert use case can seed anyway, so it may make sense to start with Insert. Also, identify the “data profiles” for the use cases and create representative sample data, e.g. B2B orders may have 50-100 lines per order whereas B2C orders may have only 4-5 lines per order.
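A minimal sketch of generating such representative payloads; the profile names, line-count ranges and fields are assumptions for illustration only.

```python
import random

# Hypothetical data profiles: (min_lines, max_lines) per order
DATA_PROFILES = {
    "B2B": (50, 100),
    "B2C": (4, 5),
}

def make_order(profile: str, order_id: int) -> dict:
    """Build one representative order payload for the given data profile."""
    low, high = DATA_PROFILES[profile]
    lines = [
        {"line_no": i + 1, "item": f"SKU-{random.randint(1000, 9999)}", "qty": random.randint(1, 20)}
        for i in range(random.randint(low, high))
    ]
    return {"order_id": order_id, "profile": profile, "lines": lines}

# A test data set weighted the way the business actually sees orders
sample = [make_order("B2B", i) for i in range(10)] + [make_order("B2C", i) for i in range(100)]
print(len(sample[0]["lines"]))  # somewhere between 50 and 100
```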

For each use case, once the KPIs are met for a particular number of users, payload size, etc., run longevity tests for at least 24 hours to ensure that the flow does not have memory leaks or other issues. Check the relevant metrics, e.g. JVM garbage collection, database AWR reports, and purge data after each run to keep the tests consistent.
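One simple, hedged way to sanity-check a longevity run for leaks is to compare heap usage after full garbage collections early and late in the run; the samples and threshold below are illustrative, and in practice the values would come from GC logs or a monitoring tool.

```python
# Heap used (MB) immediately after each full GC, sampled over a 24-hour run.
post_gc_heap_mb = [512, 530, 545, 610, 700, 820, 950, 1100]

def looks_like_leak(samples: list, max_growth_ratio: float = 1.3) -> bool:
    """Flag the run if post-GC heap keeps growing well beyond its early baseline."""
    baseline = sum(samples[:3]) / 3   # average of the first few samples
    tail = sum(samples[-3:]) / 3      # average of the last few samples
    return tail > baseline * max_growth_ratio

print(looks_like_leak(post_gc_heap_mb))  # True: heap retained after GC roughly doubled
```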

When the above passes, gradually increase the number of users and the payload size on the same use case to identify system limitations under load. Once that use case is optimized to its KPIs for concurrent users and payload, add new flows to the mix and tune again.
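A minimal sketch of such a ramp-up loop against a hypothetical endpoint; the URL, step sizes, payload and pass criterion are assumptions, not part of any product.

```python
import concurrent.futures
import time
import urllib.request

ENDPOINT = "http://soa-test.example.com/orders"   # hypothetical integration endpoint

def submit_order(payload: bytes) -> float:
    """Submit one order and return its response time in seconds."""
    start = time.time()
    req = urllib.request.Request(ENDPOINT, data=payload, method="POST")
    with urllib.request.urlopen(req, timeout=30):
        pass
    return time.time() - start

def run_step(users: int, payload: bytes, requests_per_user: int = 20) -> float:
    """Run one concurrency step and return the average response time."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=users) as pool:
        times = list(pool.map(lambda _: submit_order(payload), range(users * requests_per_user)))
    return sum(times) / len(times)

# Ramp up until an assumed response-time KPI of 5 seconds is breached.
for users in (10, 25, 50, 100, 200):
    avg = run_step(users, payload=b"<order>...</order>")
    print(f"{users} users -> {avg:.2f}s average")
    if avg > 5.0:
        print("KPI breached; stop ramping and investigate")
        break
```

In practice a dedicated load tool would replace this loop, but the structure stays the same: fixed steps, a clear pass criterion, and stop on the first breach rather than pushing on.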

While the above may again seem obvious, the temptation to “switch gears” when one use case is not fully working can create a lot of overhead for project teams in switching context, setting up data for the new use case, and so on. It is better to complete one use case fully before targeting others.

5. What about Standalone Testing for Integrations?

Standalone testing – stubbing out the enterprise applications – is a useful strategy to identify integration hotspots and to remove the unknowns of enterprise application performance from the integration scenario. However, be aware that it will not identify all performance issues. Developing stubs requires substantial investment to emulate the edge applications, and it can be non-trivial for enterprise applications, which typically have complex setups. Furthermore, some integration settings on the SOA server will typically change when the real applications are introduced, so avoid over-tuning the solution during standalone integration testing.
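For illustration, a stub of this kind could be as small as a fixed-latency HTTP responder standing in for an edge application; the port, latency and response body below are assumptions.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import time

class StubAppHandler(BaseHTTPRequestHandler):
    """Stands in for an edge enterprise application during standalone testing."""

    def do_POST(self):
        self.rfile.read(int(self.headers.get("Content-Length", 0)))  # drain the request body
        time.sleep(0.2)  # emulate a fixed, representative application response time
        body = b'<orderResponse status="ACCEPTED"/>'
        self.send_response(200)
        self.send_header("Content-Type", "application/xml")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # keep test output quiet

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), StubAppHandler).serve_forever()
```

Keeping the stub’s latency and payload representative of the real application matters more than its internals; otherwise the integration layer will look faster in the lab than it will be once the applications are plugged back in.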

Performance testing and tuning is still somewhat of an art; it requires a good understanding of the technologies involved, their limitations, and the tuning “knobs” each one offers in order to meet the KPI requirements of the integration flow. At the same time, the non-technical, project-related aspects of the testing exercise are just as essential to the success of the initiative as a whole.