Wednesday, December 16, 2009

ASP vs SaaS: Old Wine in a New Bottle

Back in the dot-com boom, the concept of the Application Service Provider (ASP) was introduced as a new breed of software delivery. ASPs got the application-outsourcing business rolling by hosting third-party client-server or early web applications. This was a new delivery model at a time when the industry knew only in-house applications. The ASP took full advantage of the internet: applications were hosted somewhere on the net, managed remotely, and used through web forms, with the provider miles away from the end user. Enterprises did not need to acquire or maintain the software; they could use it for a subscription fee. This model burst along with the dot-com bubble. Though it was a revolutionary business model, the immaturity of the tools and standards was the real villain in this story; in other words, the idea was a forerunner of the technology. Now we have started talking about Software as a Service again, but with a basic difference: this time the foundation is stronger. Broadband connections are faster and cheaper now, and web services and Web 2.0 standards are more mature and industry-proven. These improvements make it possible to host applications that are on par with owned applications in terms of usability and speed. The other '*aaS' models also helped achieve elasticity, which in turn led to effective usage of available resources. The emergence of multi-geo organizations further accelerated this paradigm shift.

Now let us discuss how they differ in outlook. Though the medium is the same (the internet), the ASP targeted delivery, whereas SaaS focuses more on the service aspect of the product. In the ASP model the provider is a middleman of sorts who packages and hosts third-party applications in a data centre for the end customer; the original application may not have been designed for hosting at all. The new breed, SaaS, is designed specifically to be used as a hosted service. SaaS offerings are generic services, as opposed to the customer-specific, bulky applications of the ASP era; since ASP applications were customer-specific, their maintenance cost was high. Another visible difference is billing granularity. ASP billing was per server or per user, but SaaS can bill by CPU cycles or bytes transferred, which makes it much easier to scale a system.
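The difference in billing granularity can be sketched in a few lines. A minimal sketch, assuming purely made-up function names and rates (no real provider's pricing is implied):

```python
# Illustrative contrast between the two billing models; all names and
# rates here are hypothetical. Amounts are in integer cents.

def per_seat_bill_cents(seats, cents_per_seat):
    """ASP-era model: a flat fee per licensed user, busy or idle."""
    return seats * cents_per_seat

def metered_bill_cents(cpu_seconds, gigabytes_out,
                       cents_per_cpu_second, cents_per_gigabyte):
    """SaaS model: charge only for the resources actually consumed."""
    return (cpu_seconds * cents_per_cpu_second
            + gigabytes_out * cents_per_gigabyte)

# A 100-seat ASP contract costs the same whether the app is used or not:
print(per_seat_bill_cents(100, 2000))      # 200000 (i.e. $2000)

# A metered SaaS bill shrinks in a quiet month:
print(metered_bill_cents(500, 2, 1, 900))  # 2300 (i.e. $23)
```

The point of the sketch is only that metered billing tracks actual consumption, which is what lets a SaaS tenant scale up or down without renegotiating a per-seat contract.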

ASP did not prove to be a grand success though the concept was novel, whereas SaaS is gaining momentum in the industry. Let us see what changed the mindset of decision makers towards embracing software services. I can see both financial and technological reasons behind this shift. The turbulence in the global economy is one of the compelling reasons for CIOs to use a rented service to run the show. Another reason is ever-changing technology and continuous business optimization. Every enterprise has to implement new business ideas rapidly to survive in the market, and building a quick solution from limited existing IT resources is next to impossible, so the answer is to rent a service. All of these developments, coupled with the IT outsourcing habit, have caused many CIOs to hand their company's IT assets over to a SaaS provider, which was not the case a couple of decades ago.

Though the objectives and technology stacks differ, I still like to believe SaaS is a successor of ASP with technological and business-model improvements. At least conceptually, both give a similar outcome to the end user, though the usage patterns may differ. I would like to see more perspectives on this.

Friday, July 10, 2009

SPML in a cloud view

Security standards have always fascinated me, and no wonder the ones for cloud computing do too. Though I believe the future of computing is in the cloud, I am a bit sceptical about the security mechanisms in place. A standard way of provisioning is the first and foremost security measure that delights a cloud user, especially in the corporate sector. Service Provisioning Markup Language (SPML), a standard from OASIS, is for exchanging user information, resource information, and service provisioning information between systems. What follows are my first-hand feelings on the usage of SPML in the cloud, though not an expert opinion.

Let me start by asking the question: What is provisioning?
As per the OASIS Provisioning Services Technical Committee, provisioning is the automation of all the steps required to manage (set up, amend and revoke) user or system access entitlements or data relative to electronically published services. Before we get into the details of provisioning, let us take the scenario of an employee joining a company. In most modern enterprises he or she will be greeted with a set of documents, followed by a PC or laptop. Now the hard work of HR starts: setting up the working environment for the employee. It begins with getting credentials and a mail account; beyond that, based on the role, the new hire may need access to various business applications across the enterprise. The earlier, the better!
So now our HR executive is busy with a series of calls and emails to the IT admin: we have a new joiner, and we need to set up accounts and get him a PC. The IT service requires a set of details, like last name, SSN and so on, to create the account. Arguably, in some big enterprises this may be automated as part of a workflow process, and that is sufficient for an in-house IT setup. But consider an enterprise with services spread across the cloud, where each service may be hosted by a different cloud provider. This makes the situation of our HR rep really complex: he needs to ensure that every one of those services is set up correctly. The situation is even more catastrophic when an employee resigns. It is really important that this user is de-provisioned (who knew there was a word 'de-provision'!) the minute he leaves the organisation; otherwise the organisation's assets could be in danger. So we need a standards-based automatic provisioning system wherever we live in a heterogeneous IT ecosystem. Here comes the importance of SPML: it provides a standard for securely communicating provisioning details between various applications and services.

In the SPML model a provisioning system contains three essential components: a Requesting Authority (RA), a Provisioning Service Provider (PSP), and a Provisioning Service Target (PST).

Requesting Authority (RA): In a typical provisioning system the RA is the client. Well-formed SPML documents are created by the RA and sent to the PSP. These requests describe an operation to be performed at the PSP end.
Provisioning Service Provider (PSP): The component that listens for and processes well-formed SPML documents.
Provisioning Service Target (PST): The actual software or application on which the action is taken.
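To make the three roles concrete, here is a toy sketch in Python. Every class name, method name and request field below is illustrative only; real SPML is an XML protocol with its own OASIS-defined schema, not this dictionary format.

```python
# Toy sketch of the SPML roles: an RA sends requests, a PSP processes
# them, and PSTs are the applications actually being provisioned.

class Target:
    """PST: an actual application being provisioned (mail system, CRM, ...)."""
    def __init__(self, name):
        self.name = name
        self.accounts = {}

    def add_account(self, user_id, attrs):
        self.accounts[user_id] = attrs

    def delete_account(self, user_id):
        self.accounts.pop(user_id, None)


class ProvisioningServiceProvider:
    """PSP: listens for well-formed requests and acts on its targets."""
    def __init__(self, targets):
        self.targets = {t.name: t for t in targets}

    def process(self, request):
        target = self.targets[request["target"]]
        if request["operation"] == "addRequest":
            target.add_account(request["psoId"], request["data"])
            return {"status": "success"}
        if request["operation"] == "deleteRequest":
            target.delete_account(request["psoId"])
            return {"status": "success"}
        return {"status": "failure"}


# RA side: HR's system issues one request per target -- no phone calls.
psp = ProvisioningServiceProvider([Target("mail"), Target("crm")])

# New joiner: provision everywhere.
for name in ("mail", "crm"):
    psp.process({"operation": "addRequest", "target": name,
                 "psoId": "jdoe", "data": {"lastName": "Doe"}})

# Resignation: de-provision everywhere, the same minute.
for name in ("mail", "crm"):
    psp.process({"operation": "deleteRequest", "target": name,
                 "psoId": "jdoe"})

print(psp.targets["mail"].accounts)   # {} -- access revoked
```

Note how the resignation scenario from the earlier HR story collapses into one loop: the RA issues one standard request per target instead of a chain of phone calls and emails.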

Though the SPML standard has been around for a few years (too lazy to check the exact year!), its importance is augmented by the dawn of cloud computing.



Monday, June 15, 2009

Cloud on Open Source

Nowadays cloud computing is a buzzword and is becoming a popular model of IT service. Everyone talks about the benefits and the business agility of enterprises that use services offered through cloud computing, as opposed to conventional in-house hosted applications. Generally we term the three famous 'aaS's (IaaS, PaaS and SaaS) collectively as cloud computing. The business model of all three works in a similar manner: the user is charged either based on usage or on a monthly subscription. In either case the user need not bear the purchase and maintenance cost of the IT assets through which they enjoy the service. Most analysts and architects believe this model encourages enterprises to adopt business changes faster and helps them improve their business processes by leveraging these new services. As we can see, the main attractions of this model are near-zero initial cost and the ability to scale as required. Though it is at an early stage of evolution, most analysts unanimously vote it the next generation of IT service.
All these days we have had another strong business model, the open source model, which evangelizes the freedom to replicate and scale without additional cost. Open source product vendors charge users for maintenance service and technical help rather than a license fee. The open source philosophy stands for the freedom of users to use, modify and distribute their favorite software without any copyright issues. This is indeed a good business model for customers, as they need not pay extra for the initial setup or for distributing the software to other machines. More importantly, there is no vendor lock-in.
In the early days most programs were single-user, and hence a program implicitly meant the executable as well as the data used by it. But there was a paradigm shift once networked services appeared on the horizon. In a networked service, whether a simple web application or a complex ERP process, the program runs on the server and the users 'use' the software through the permissible interfaces, most commonly a web browser.
The introduction of cloud computing extends this use-and-pay model. People like Richard Stallman and some open source philosophers went to the extreme of labeling cloud computing a sin, protesting that it traps the user in vendor lock-in. A few other groups, like Tim O'Reilly's, believe that this is a natural evolution of the open source model. Open source is about enabling innovation and re-use, and at its best, cloud computing can be bent to serve those same aims. Though we may not be able to predict the future, it is interesting to see whether there is any common space where both these models can complement each other and converge for a better user experience. As we saw earlier, most cloud computing implementations are at an initial stage and still lack many features and standards, which hinders enterprises from adopting the model. Most users worry about the safety of their critical data in the cloud environment. What happens to my data if the provider shuts shop and runs away? What happens to my program if the platform and/or the framework changes? How can I change providers if I am unhappy with the current one? I think we can answer these questions by applying the same open source philosophy.
First and foremost is the adoption of open source platform stacks in a cloud computing implementation. This not only allows the platform to be replicated but also reduces the overall cost, and hence the user fee. Google App Engine is an example: it provides Java- and Python-based application frameworks for users to develop their applications and deploy them on the cloud. Another open source framework that helps create a cloud environment is Eucalyptus (Elastic Utility Computing Architecture for Linking Your Programs To Useful Systems). The current interface to Eucalyptus is compatible with Amazon's EC2 interface, but the infrastructure is designed to support multiple client-side interfaces. Eucalyptus is implemented using commonly available Linux tools and basic web-service technologies, making it easy to install and maintain. This approach of abstracting and providing open interfaces gives users hassle-free movement between providers.
Ensuring the use of the AGPL (GNU Affero General Public License), which is designed specifically for networked services, guarantees that a user of a particular service can get the source code of the software from a publicly accessible server. This reduces the fear of the provider closing its shutters: even in that case we can get the code and host the service somewhere else. But the data and the other collaborating users and services still remain a problem. The main problem with data is the format in which it is stored for the use of the program. Providing APIs, tools and open standards to retrieve data could minimize this issue; this is known as open knowledge, and such services can be classified as open software services. The definition of open knowledge is available at http://opendefinition.org/1.0. The open micro-blogging site http://identi.ca has almost achieved openness in code as well as in knowledge: you can freely download the data and code from their server and set up your own service if required. It also uses open standards like the Open Microblogging protocol (http://openmicroblogging.org/) and OpenID for authentication, so it is easier to collaborate with other communities and avoid vendor lock-in. The non-profit organization Open Cloud Consortium (OCC) is a big step towards making open standards and frameworks for cloud computing.
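The round trip that open data formats make possible can be sketched like this. The "accounts-v1" format name and field layout are made up for the example; JSON stands in for any documented, openly specified interchange format:

```python
# Sketch of data portability via an open format: if the export format
# is open and documented, any other provider (or your own server) can
# re-import the data losslessly, defusing the "provider shuts shop" fear.
import json

def export_accounts(accounts):
    """Provider A dumps its data in the open, self-describing format."""
    return json.dumps({"format": "accounts-v1", "accounts": accounts})

def import_accounts(payload):
    """Provider B re-loads the very same documented format."""
    doc = json.loads(payload)
    if doc.get("format") != "accounts-v1":
        raise ValueError("unknown export format")
    return doc["accounts"]

original = [{"user": "jdoe", "email": "jdoe@example.org"}]
assert import_accounts(export_accounts(original)) == original  # lossless move
```

The design point is simply that the format check and the schema are public knowledge, so the exporting provider holds no privileged ability to read its own dumps.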
We can see that cloud computing has borrowed from open source in terms of its governing principles, which could well be open source's lasting contribution to the cloud.