Friday, February 18, 2011

10 Technologies for 2011 : 9. Mobile Apps

Mobile Apps

Yes, Mobile Devices and related technologies may seem overrepresented in this list of top ten technologies for 2011. That's because mobile technology is driving massive change in the technology sector. The decades-old dominance of the PC platform is being supplanted at breakneck pace by mobile devices.

The traditional model of IT groups dictating what devices could connect to corporate systems is fading fast. New devices from Apple and the myriad vendors who support the Android platform are being brought into the workplace without any prior approval. The line between personal and corporate use of mobile devices is getting blurred. The iPad is being adopted at positively astounding rates by senior management. The days of the ultralight laptop as status symbol are numbered. Showing up at meetings with an iPad is almost expected as you get closer to the top of the corporate ladder.

What does this mean for corporations that create content traditionally delivered by a dedicated desktop app, or even a Web 2.0 app delivered through a standard browser?

It means every company needs a mobile strategy. At the top of the house, there must be a commitment to treating mobile devices as first-class citizens when designing content delivery channels. It is a hard pill to swallow for many, but the fact is that mobile devices entering the marketplace now are more powerful than perhaps 75% of business desktops out there for general compute tasks, lagging perhaps only in their graphics rendering prowess. Given that corporate desktop refresh cycles typically run three years, it is almost a certainty that within the next year personal and corporate mobile devices will be more powerful than the desktop computers employees have.

So what are the main elements of a mobile application strategy? (1) Decisions around targeting a specific device family (iOS, WebOS, Android, Windows Phone 7, etc.) or building a general browser-based app that is smart about scaling down to mobile devices (a sketch of this detection approach follows below). (2) Decisions around a free vs. paid, or hybrid freemium, business model to deliver content. (3) Decisions around online vs. offline access to content. (4) Decisions around extending the corporate web brand or starting afresh. (5) Training or hiring new staff for mobile device development, or retaining a third-party specialist firm. (6) Rapid iteration of a concept vs. execution of a fully realized plan.
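To make the browser-based option in (1) concrete, here is a minimal sketch in Python of the kind of server-side user-agent detection a general web app might use to decide when to serve a scaled-down mobile view. The keyword list and template names are illustrative assumptions, not an exhaustive or authoritative scheme.

```python
import re

# Illustrative (not exhaustive) keywords found in common mobile user-agent strings.
MOBILE_UA_PATTERN = re.compile(
    r"iphone|ipad|ipod|android|webos|windows phone|blackberry",
    re.IGNORECASE,
)

def pick_template(user_agent: str) -> str:
    """Return the view a browser-based app might serve.

    'mobile.html' and 'desktop.html' are hypothetical template names.
    """
    if MOBILE_UA_PATTERN.search(user_agent or ""):
        return "mobile.html"
    return "desktop.html"

if __name__ == "__main__":
    ua = "Mozilla/5.0 (iPhone; U; CPU iPhone OS 4_2 like Mac OS X)"
    print(pick_template(ua))  # -> mobile.html
```

Client-side approaches (CSS media queries, viewport handling) are equally valid; the point is that scaling down to mobile must be designed in, not bolted on.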

Some of these decisions may seem obvious to many of us; however, each company's market position and preparedness is different, and that must influence its decision making.

Common pitfalls include underestimating the complexity and expertise required in user interface design for mobile devices, underbudgeting for long-term maintenance of the application, and not factoring in the influence of customer reviews, which are now part of every app store.

Monday, February 14, 2011

10 Technologies for 2011 : 8. Security in the Cloud

Security in the Cloud


As we get into 2011's spend/invest/harvest IT cycle in earnest, CIOs and CTOs of most medium to large corporations are certain to be grappling with the challenges of addressing increasing use of Cloud-based infrastructure to accomplish Enterprise goals. It is not unusual in these Cloud-obsessed times for senior IT management to be presented with a Hobson's choice: they can implement large systems any way they like, as long as they choose Cloud. Since this trend is only accelerating, we can choose to embrace it and be prepared, or ignore it at great personal and corporate cost.

One of the most pressing challenges IT faces in dealing with Cloud-based infrastructures is Security.

When looking at Cloud providers, it is helpful to break down the Security domain into groupings:


Cloud Data Center

Ultimately, Cloud providers either maintain their own data centers (Salesforce.com, Microsoft) or lease capacity from ISPs or very large infrastructure providers like Google, Microsoft or Amazon. Since not all data centers are created equal, it is quite helpful to turn to third-party standards to benchmark how they compare against each other and against generally accepted best practices.

The key tools in this area include SAS 70 audits, an industry standard way of looking at physical security (such as access cards and biometrics at the data center), network security (perimeter and host based), and access control. Simply having a SAS 70 audit done signals some respectability among providers; these are generally done by well known audit firms. One step further is getting access to any findings generated by the audit. These are rarely shared, given the sensitive nature of the findings, and may come into play only when large contracts are involved.

An additional way to verify the integrity of the data center and its basic architecture is to commission, or get access to, third-party Vulnerability and Penetration Testing reports. These involve controlled probing of the network and host infrastructure to find weaknesses, performed by a third party that specializes in this work and may employ specialized "White Hat" hackers to execute its tests. Much like the SAS 70 situation, having these done at a data center signals confidence; having access to the specific reports is far more useful, but they are very hard to obtain for obvious reasons.


Data in Transit

As data traverses corporate networks as well as the Internet, it needs to be kept secure. Most often this is accomplished through point-to-point HTTPS (SSL) connections. It is worthwhile to check whether the protocols used for communicating between your corporate networks and the Cloud provider run over SSL. Occasionally a provider on the cusp of widespread adoption has little experience carrying sensitive data, may still be using plain HTTP, and may therefore intend to send your data in clear text. Hopefully, as the decade advances, these instances will be few and far between. However, it doesn't hurt to be careful and ask before contracts are signed.
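As a quick sanity check before contracts are signed, something like the following Python sketch (standard library only; the provider hostname is a placeholder) can confirm that an endpoint actually negotiates TLS and presents a certificate that validates against your CA bundle.

```python
import socket
import ssl

def check_tls(host: str, port: int = 443) -> dict:
    """Open a TLS connection to host:port, verifying the certificate
    against the system CA bundle, and report what was negotiated."""
    context = ssl.create_default_context()  # enables cert + hostname checks
    with socket.create_connection((host, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            return {
                "protocol": tls.version(),   # e.g. 'TLSv1.2'
                "cipher": tls.cipher()[0],
                "subject": dict(rdn[0] for rdn in tls.getpeercert()["subject"]),
            }

if __name__ == "__main__":
    # 'provider.example.com' is a placeholder for your Cloud provider's endpoint.
    print(check_tls("provider.example.com"))
```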

A new generation of Cloud providers now allows VPN connections directly into their Cloud infrastructures, creating secure tunnels over the Internet and dispensing with the need for secure protocols in the application itself.



Data at Rest

While HTTPS covers transport of data, and most providers will find it easy to comply with, some of the thornier issues involve what happens to your data as it sits in the provider's cloud databases or file systems.

Your company may be one that deals with sensitive data that must not leave your premises, or you may simply not be willing to take the risk of placing that data in the cloud. In such cases, you can attempt to screen sensitive data by means of automated search and replace mechanisms that, for example, look for known patterns such as Social Security Numbers, dates of birth and telephone numbers, and replace them with filler characters to signify that the data was obfuscated.
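A minimal sketch of such a screening pass, in Python, might look like the following. The regular expressions are deliberately simplistic, US-centric examples; a production screen would need far broader, locale-aware patterns.

```python
import re

# Illustrative patterns only; real screening needs broader, locale-aware rules.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
    "dob": re.compile(r"\b\d{2}/\d{2}/\d{4}\b"),
}

def obfuscate(text: str, filler: str = "X") -> str:
    """Replace matched sensitive values with filler characters of equal length."""
    for pattern in PATTERNS.values():
        text = pattern.sub(lambda m: filler * len(m.group()), text)
    return text

if __name__ == "__main__":
    record = "SSN 123-45-6789, DOB 01/02/1970, phone 555-867-5309"
    print(obfuscate(record))
```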

For large enterprises, one of the most difficult issues in dealing with Cloud providers is multi-tenancy. For reasons of economies of scale, it is very likely that your provider is hosting your data on the same infrastructure as several of your potential or real competitors. It is then critical that the risk this represents is mitigated through well thought out approaches. Some of the more common strategies include: (i) Encrypting data fields with strong encryption, so that only your application knows how to decrypt them for consumption. As the data sits in the database, it is unreadable to a Systems Administrator at the Cloud provider. This is a good solution, except that it consumes resources and will slow your application down if you overuse it. (ii) Very granular access control, locking down access to fields based on roles.
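As an illustration of strategy (i), the sketch below uses the third-party Python cryptography package (an assumption on my part; any strong symmetric cipher would serve) to encrypt a field before it is written to the provider's database.

```python
# Requires the third-party 'cryptography' package: pip install cryptography
from cryptography.fernet import Fernet

# In practice the key lives in your own key store, never with the Cloud provider.
key = Fernet.generate_key()
cipher = Fernet(key)

def encrypt_field(value: str) -> bytes:
    """Encrypt a single sensitive field before writing it to the provider's DB."""
    return cipher.encrypt(value.encode("utf-8"))

def decrypt_field(token: bytes) -> str:
    """Decrypt a field after reading it back; only key holders can do this."""
    return cipher.decrypt(token).decode("utf-8")

stored = encrypt_field("123-45-6789")  # opaque ciphertext at rest
print(stored)                          # unreadable to the provider's admins
print(decrypt_field(stored))           # -> 123-45-6789
```

The design point is that the key never leaves your control: all the provider stores, and all its administrators can see, is ciphertext.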

Once you work your way through the data encryption and access control policies, you still have to resolve questions of (i) who owns the data (usually the client corporation, very rarely the Cloud provider), (ii) how the data will be backed up and made available outside the Cloud (in case your Cloud provider ceases operation or has a catastrophic crash), and (iii) what happens to the data after your contract with the provider terminates (usually data and data backups are verifiably erased).


Monitoring

The best Cloud providers already have well regarded logging and monitoring suites, but your corporation may choose a centralized logging or SIEM strategy, with security events such as logins and record deletions logged to a central platform on your premises or in the cloud, serving as a central resource for forensics when things go wrong.
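For the do-it-yourself end of that spectrum, the Python standard library's syslog handler gives a feel for how application security events can be forwarded to a central collector. The collector hostname below is a placeholder for whatever SIEM or log platform you run.

```python
import logging
import logging.handlers

# 'siem.example.com' is a placeholder for your central log collector / SIEM.
handler = logging.handlers.SysLogHandler(address=("siem.example.com", 514))
handler.setFormatter(logging.Formatter("%(asctime)s %(name)s %(message)s"))

security_log = logging.getLogger("cloudapp.security")
security_log.setLevel(logging.INFO)
security_log.addHandler(handler)

# Example security events forwarded to the central platform for forensics.
security_log.info("login user=jsmith source_ip=10.1.2.3 result=success")
security_log.warning("record_delete user=jsmith record_id=44721")
```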

It is also vitally important to have access to incident logs generated by the Cloud provider that are specific to your corporation.


Regulations

Outside of the security controls you will put into place as part of your strategy of securing data for your Enterprise, you may have to comply with a variety of Regulatory regimes depending on where you do business and what kind of data you process. 

When doing business with European countries, having Safe Harbor compliance signals general compliance with the principles of Data Privacy law in the EU. Safe Harbor is part of the framework agreed to by the EU and the US Department of Commerce to harmonize the differing data privacy regimes in their respective geographies. 

PCI is a standard adopted among financial services companies, with particular focus on those that process credit card transactions on behalf of their customers.

Finally, HIPAA is a Data Privacy law covering medical records in the US that is broad in its scope and reach. Companies that deal with other Healthcare companies may be forced to comply with its requirements.


Conclusion

As you can see, Security in the Cloud is less a specific technology and more a family of technologies, policies and strategic choices. However, there is no doubt whatsoever that it is one of the critical areas for IT investment in 2011.



Saturday, February 12, 2011

10 Technologies for 2011 : 7. HD Teleconferencing

HD Teleconferencing


This one is not usually considered a core IT function in many large corporations and is often in the domain of facilities management, along with electrical and data wiring. That's only because the revolution in video conferencing has been so rapid. IT has not caught up. Yet.

Five to eight years ago, the state of the art in video conferencing was a small room filled with a couple of rows of seats and a large screen, with a smart board and a few mics in the ceiling. The pictures were grainy and the refresh rates too slow to convey any non-verbal cues. The audio and video feeds would frequently get out of sync. Often one came away from a video conference thinking it was not worth the trouble and that the participants would have been better off just using a dial-in number. After all, the first 10 minutes would be consumed by setup issues.

Things have definitely changed. Invention is often spurred by the right circumstances, and the circumstances were certainly conducive to the development of high fidelity video conferencing. Several factors came together at the same time: (1) In the post-9/11 world, the prospect of travel, with long wait times in airport security lines and ever diminishing amenities in the air, meant executives were far less likely to want to travel. (2) In the recession that followed 9/11 and the dot-com bust, corporate finance departments suddenly became very aggressive in their efforts to control costs and sought every opportunity to minimize expenses, including travel expenses. (3) Gas prices rose very quickly during the last decade. (4) Communication bandwidth increasingly became a commodity and prices fell very quickly. (5) Network equipment capable of supporting true Quality of Service (QoS) became more easily available and indeed commonplace. (6) HD became mainstream as a video format with industry-wide availability.

All of these favorable conditions created the right environment to foster rapid innovation in the videoconferencing world.  Some of the early leaders in the HD Video Conferencing world included Tandberg, Radvision and Polycom.  Cisco has since acquired Tandberg and markets its products under the Cisco Telepresence line.

Although the initial cost of setting up a high end HD conference room is quite high (likely in the six figures for a medium sized room), the cumulative savings in travel expenses can produce an ROI that is quite attractive.
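A back-of-the-envelope calculation shows why. All the figures below are hypothetical placeholders, purely to illustrate the arithmetic; substitute your own travel volumes and costs.

```python
# Hypothetical figures purely for illustration; plug in your own estimates.
setup_cost = 250_000        # telepresence suite build-out (six figures)
annual_upkeep = 30_000      # support, bandwidth, maintenance
trips_avoided_per_year = 120
cost_per_trip = 2_500       # flights, hotels, lost productive time

annual_savings = trips_avoided_per_year * cost_per_trip  # 300,000
net_annual_benefit = annual_savings - annual_upkeep      # 270,000
payback_years = setup_cost / net_annual_benefit          # ~0.93 years

print(f"Payback period: {payback_years:.2f} years")
```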

The HD video stream, very low call initiation times, and the advanced camera, mic and speaker placement that come together in a telepresence suite go a long way toward creating an environment where, for the duration of the call, participants forget they are in a video conference rather than face to face.

2011 is a key transitional year for this technology in large corporations. With the improving economy, financial leaders at these companies are more likely to greenlight capital expenses this year, allowing legacy ISDN-based videoconferencing setups to transition into full telepresence suites.