Saturday, March 20, 2010

Business Process Management (BPM)


“BPM is a structured approach employing methods, policies, metrics, management practices, and software tools to manage and continuously optimize an organization's activities and processes”
-- Gartner Research

Business Process Management (BPM) can be defined as the practice of improving the efficiency and effectiveness of an organization by automating its business processes [in general, a process comprises the sequence of steps that should be followed to execute a task]. BPM was formerly referred to as Business Process Reengineering.

Some of the general goals that every organization aims for are:

(i) Better Customer Service
(ii) More sales channels
(iii) Online services
(iv) Better efficiency

Besides, the ever-changing business scenario demands higher levels of quality, cost optimization, on-time delivery, rapid adaptability, identification of productivity bottlenecks, and risk mitigation and control. BPM provides solutions to all these demands. Some of the benefits that BPM provides include:

(i) Reduces risk in business processes
(ii) Consistent quality output
(iii) Increased Return on Investment
(iv) Wider range of participation in process
(v) Drives process improvement
(vi) Simplified Training

The central aim of BPM is to align the organization with its customers’ wants and needs. BPM attempts to continuously improve business processes and achieves process optimization by defining, measuring, and improving them. The concepts of BPM have evolved from operations transformation and enable flexible design, deployment, monitoring and tracking, process focus, and efficiency.

Each organization will have business processes that are unique to its business model. These processes will evolve over time as the business reacts to market conditions. Therefore, the BPM software tool in use at the organization must be easily adaptable to the new conditions and requirements and continue to be a perfect fit for the organization. An effective use of BPM demands that organizations stop focusing exclusively on data and data management, and adopt a process-oriented approach that blends machine and manual operations.

The concepts of KAIZEN are often used for business process improvements. In general, KAIZEN can be explained as:
KAI = Change
ZEN = Good
Changing for improvement; Changing to become better.

According to Wikipedia, “A closer definition of the Japanese usage of KAIZEN is ‘to take it apart and put back together in a better way.’ What is taken apart is usually a process, system, product, or service.”

The process of implementing KAIZEN can be summarized as follows:
Go to GEMBA = Go to the place where things happen
Watch GEMBUTSU = Look at what happens
Look for MUDA = Look for waste
Perform KAIZEN = Improve something good to make it even better

Before we move on to the life cycle of Business Process Management, let us take a closer look at what a ‘model’ and a ‘workflow’ are. A ‘model’ is a multi-dimensional representation of reality capturing a moment in time. A model has a purpose, perspective, audience, content, level of detail, and phases as related to a life cycle. A model conveys a message and summarizes information. A Business Process Model describes the details of the way a business conducts its work. A workflow is an integral element of Business Process Management. Workflow is a term used to describe work definition, allocation, and scheduling. It defines the sequence of work steps and the conditions under which work flows from one step to the next. Workflow handles the routing of work between resources [people, systems, or machines] and manages the order in which these steps are handled. Workflow also enables employees to monitor and reconfigure the flow of a business process as needed.
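As a rough illustration of work definition, allocation, and routing, the sketch below runs a work item through an ordered sequence of steps, each allocated to a resource. The step names, resources, and the leave-request example are invented for the illustration; real workflow engines are far richer.

```python
# A minimal sketch of workflow routing: each step defines the work,
# the resource it is allocated to, and the order in which it runs.
# Step and resource names here are illustrative, not from any real BPM tool.

def run_workflow(steps, work_item):
    """Route a work item through steps in sequence, recording each hand-off."""
    trail = []
    for step in steps:
        trail.append((step["name"], step["resource"]))
        work_item = step["action"](work_item)
    return work_item, trail

leave_request = {"employee": "A. Smith", "days": 3}
steps = [
    {"name": "submit",  "resource": "employee",  "action": lambda w: {**w, "submitted": True}},
    {"name": "approve", "resource": "manager",   "action": lambda w: {**w, "approved": True}},
    {"name": "record",  "resource": "HR system", "action": lambda w: {**w, "recorded": True}},
]
result, trail = run_workflow(steps, leave_request)
print(trail)  # work flowed: employee -> manager -> HR system
```

The trail shows the routing of work between resources, while the step order captures the sequence in which work flows.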

From a high-level view, the life cycle of Business Process Management consists of Process Mapping, Process Deployment, and Process Improvement.

Process Mapping consists of Process Discovery and Process Design. Process Discovery consists of identifying the key processes and defining the rules and roles for each process. Process Design involves modeling the process with its rules and roles on to the system.

The key focus of Process Deployment is integrating participating systems and training the different stakeholders of the processes.

Process improvement involves analysis and optimization. Analysis identifies bottlenecks in the processes. Analysis also measures the time taken per work step, per person and per process. The role of optimization is to redesign processes so that bottlenecks identified during analysis are removed.
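The analysis step described above can be sketched in a few lines: measure the average time taken per work step and flag the step that dominates as the bottleneck candidate for redesign. The step names and timings below are made-up sample data.

```python
# Sketch: compute average time per work step and flag the bottleneck.
# Step names and observed minutes are invented sample data.
from collections import defaultdict

def average_step_times(observations):
    """observations: (step_name, minutes) pairs -> {step: average minutes}."""
    totals, counts = defaultdict(float), defaultdict(int)
    for step, minutes in observations:
        totals[step] += minutes
        counts[step] += 1
    return {step: totals[step] / counts[step] for step in totals}

observations = [
    ("enter order", 5), ("enter order", 7),
    ("credit check", 30), ("credit check", 50),
    ("ship", 12), ("ship", 10),
]
averages = average_step_times(observations)
bottleneck = max(averages, key=averages.get)
print(bottleneck)  # 'credit check' dominates the cycle time
```

Optimization would then focus on redesigning the flagged step, exactly as the life-cycle description suggests.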

A key component of BPM is Business Activity Monitoring (BAM). As the name implies, it is essentially a facility for automated monitoring of an organization’s business process activity. Before a BAM facility is put in place, it is very important to define the Key Performance Indicators (KPIs) that need to be tracked using BAM. This will prevent information overload and overreaction to business exceptions. Once the KPIs are defined, a system needs to be created that allows monitoring and responding to changes, ideally in real time. Business Activity Monitoring allows an organization to respond faster to new opportunities and threats appearing in the business scenario. The core concept of Business Activity Monitoring is recognizing an enterprise’s key performance indicators and putting the right technology in place to monitor them. A typical BAM system provides real-time, graphical Key Performance Indicators and analysis, enables control, and manages ongoing business operations using closed-loop visibility. It also enables zooming in on cross-process metrics with real-time analysis to determine which processes are creating bottlenecks or which customer is most profitable. BAM also enables organizations to respond quickly to change based on business events in real time.
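The idea of tracking only predefined KPIs to avoid information overload can be sketched as a simple threshold check: each KPI has agreed bounds, and only readings outside those bounds raise alerts. The KPI names, limits, and readings below are invented for the example.

```python
# Sketch of a BAM-style check: compare live KPI readings against
# predefined thresholds and alert only on the KPIs that breach them.
# KPI names and limits are invented for the illustration.

def check_kpis(readings, thresholds):
    """Return alerts for KPIs outside their (low, high) bounds."""
    alerts = []
    for kpi, value in readings.items():
        low, high = thresholds[kpi]
        if not (low <= value <= high):
            alerts.append((kpi, value))
    return alerts

thresholds = {
    "orders_per_hour":   (50, 500),  # fewer than 50 suggests a stalled process
    "avg_response_secs": (0, 120),   # slower than 2 minutes breaches the target
}
readings = {"orders_per_hour": 38, "avg_response_secs": 95}
print(check_kpis(readings, thresholds))  # [('orders_per_hour', 38)]
```

Because only breaching KPIs surface, the business exceptions stand out without flooding the dashboard.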

BPM makes it easy for enterprises to program their current processes, automate their execution, monitor their current performance, and make on-the-fly changes to improve them. BPM software enables you to automate tasks that are currently being performed manually. Many of these tasks involve some type of application process, approval or rejection process, notifications, and status reports. Handling exceptions is an area where BPM really excels. Organizations have few problems when their processes run smoothly ninety-nine percent of the time. However, it is the one percent, where the exceptions are, that consumes the majority of the organization’s time and resources. BPM is ideal for processes that extend beyond the boundaries of an enterprise and communicate with the processes of partners, customers, suppliers, and vendors. BPM gives companies the agility to stay competitive and reduces the time elapsed in a business process. BPM also increases productivity per person. A business process consists of many steps, and a typical BPM initiative reduces the number of steps by half. A business process needs many people and resources, and a good BPM implementation should reduce the number of resources needed for the same business process. BPM also helps improve coordination across departments and geographic locations of an organization.

This concludes the blog post on Business Process Management (BPM). Thanks for your interest.

Saturday, March 13, 2010

Human Resource Management (HRM)


"People are our most valuable asset" has become a cliché these days, but it will be very hard to find an organization that will disagree with this statement. In this blog post, we will look at some of the aspects of Human Resource Management (HRM). A complete coverage of all aspects of Human Resource Management is, however, beyond the scope of this blog post.

According to Wikipedia, Human Resource Management (HRM) is the strategic and coherent approach to the management of an organization's most valued assets - the people working in the organization, who individually and collectively contribute to the achievement of the objectives of the business. Wikipedia goes on to explain that Human Resource Management involves employing people, developing their capacities, utilizing, maintaining, and compensating their services in tune with the job and organizational requirement.

The rate of change in business scenarios has increased greatly in recent times, and in order to be successful, organizations need to absorb and manage change at a faster rate than in the past. This implies that organizations faced with the need to respond to changing business scenarios should implement a successful business strategy and should be staffed with the right people, capable of implementing that strategy. Hence, recruitment becomes a priority for any organization and is often considered a key human resource management activity. Finding the right kind of people to be brought ‘on board’ is often an expensive activity, and the job market for qualified candidates is very competitive. Further, new employees can sometimes disrupt the activities of existing employees, and new employees take time to absorb the work culture, product knowledge, and process knowledge of the organization. Briefly, the recruitment function of Human Resource Management can be described as the process of ensuring that at all times the business is correctly staffed by the right number of people with the skills relevant to the business needs.

The recruitment function ensures that the right number of the right kind of people are brought into the organization at the appropriate time. Once employees join the company, the focus is to retain them and keep them motivated to perform at their best, in tune with the business needs of the organization. As discussed above, recruitment is often an expensive process in terms of time, cost, and effort, and hence it is very important to retain the recruited employees. To retain good employees and motivate them to perform well, careful and continuous attention needs to be paid to the tangible and intangible rewards offered by the organization. Basic rewards and conditions of work, such as the number of hours to be put in per week, may be decided by regulations prevailing in a country. In general, about half of the rewards and terms and conditions are negotiated between the human resources department and the employee, and hence vary from organization to organization. Good personnel policies, which guarantee a good work environment and employee benefits, are crucial in motivating and retaining employees. It is important to keep in mind the limitations of money as a motivator and the importance of factors like job satisfaction, avenues for professional growth, and involvement while planning activities aimed at improving employee motivation. It is an acknowledged fact that the influence of behavioral science findings is becoming important in employee motivation. Hence, it is essential that the Human Resources department acts as a source of information for the application of the findings of behavioral science, educating managers about new perspectives on job design, work organization [job design and work organization is the specification of the contents, method, and relationships of jobs to satisfy technological and organizational requirements as well as the personal needs of job holders], and employee autonomy.

An organization should continuously evaluate the performance of its employees for three reasons:

(a) to improve organizational performance by improving the performance of individual contributors
(b) to identify potential candidates for promotion to higher levels in the organization or for transfer to other positions where better use of employee skills can be made
(c) to provide a basis for linking rewards to performance

A human resource department supports the employee evaluation process in several ways such as:

(a) designing and establishing an evaluation system suited to the organization
(b) defining targets for achievement
(c) explaining how to quantify objectives
(d) introducing self-assessment
(e) eliminating complexity and duplication
(f) providing training related to employee evaluation system
(g) monitoring the evaluation system

Another key function of the human resources department is employee education, training, and development. Employee education can be defined as preparing the employee for training; training involves the systematic development of the attitudes, knowledge, and skill patterns required by a person to perform a given job adequately; and employee development is the growth of the individual in terms of ability, understanding, and awareness.

Employee education, training, and development are needed in an organization in order to:

(a) develop employees to undertake higher job positions in terms of responsibilities
(b) provide training for new employees
(c) raise efficiency and standards of performance
(d) meet legal requirements
(e) inform employees

Evaluation of the effectiveness of training is done to ensure that it is cost-effective, to identify needs to modify what is being provided, to reveal new training needs, to redefine priorities, and, most of all, to ensure that the objectives of the training are being met.

This brings us to the end of this blog post on Human Resource Management (HRM). Thank you for your interest.

Saturday, March 06, 2010

Supply Chain Management (SCM)

After covering Customer Relationship Management (CRM) and Enterprise Resource Planning (ERP) in previous blog posts, let us now look at some of the basic concepts of Supply Chain Management (SCM).

What is a supply chain? In simple terms, if the business of a company involves creating a product from parts bought from suppliers and selling the product to customers, it can be said that a supply chain exists. Supply Chain Management (SCM) describes the management of the flow of materials, information, and funds across the entire supply chain, from suppliers to component manufacturers to product assemblers/integrators to the distribution of finished products, and finally to the customer. SCM can also be extended to include after-sales service, product returns, and recycling. The complexity of a supply chain will vary with the size of the business and the intricacies and number of products manufactured.

Supply Chain Management (SCM) is not necessarily a business function. It is considered a new business model necessary for an organization’s success and calls for the involvement of every member of the organization. In today’s business scenario, there is a need to be more socially and environmentally responsible while doing business, which results in more risks that need to be mitigated and managed. This, coupled with ever-increasing customer requirements and expectations, globalization, pressure on cost, and lack of availability of resources, has increased the difficulty of doing business. It is under these circumstances that managers are expected to improve profitability, increase revenue growth, and capture and protect larger market share. In order to succeed under these conditions, companies must recognize that the ultimate success of an organization depends on the ability to integrate the organization’s network of business relationships in a mutually beneficial manner. The efficient management of this network of business relationships is Supply Chain Management (SCM).


A supply chain consists of several elements or components, which are connected by the movement of products along it. The customer is at both ends of the supply chain – the supply chain starts with the customer deciding to buy a product, and the cycle is completed when the product is delivered to the customer, accompanied by the invoice [an invoice is a commercial document issued by a seller to the buyer, indicating the products, quantities, and agreed prices for products or services the seller has provided] for the product.
 
Let us now take a closer look at the different components of the supply chain.

Customer: As already discussed, the supply chain starts when a customer decides to buy a product offered by a company. Once the decision is made, the customer contacts the sales division of the company and places an order. The sales department creates a sales order, which specifies the type of the product(s), the required quantity, and the delivery date specified by the customer. If the product involved needs to be manufactured, the sales order will include a requirement that needs to be fulfilled by the production department.

Planning: Each sales order generated in response to a customer request will trigger a requirement. Such requirements from all the sales orders are collated by the planning department. The planning department then creates a production plan to manufacture the products to fulfill the customers’ orders. Manufacturing the products often requires the purchase of raw materials.

Purchasing: The purchasing department is responsible for arranging the purchase of raw materials required for the manufacture of products to fulfill customer orders. Based on inputs from the planning department, the purchasing department sends purchase orders to suppliers to deliver the necessary raw materials on the required dates as per production plans.

Inventory: The inventory division is responsible for tracking the delivery of raw materials from suppliers, verifying the quality and quantity of received materials, and moving the materials to the warehouse. The storage of materials until the production department requires them is also a task of the inventory division. Suppliers also send invoices for the materials delivered to the company.

Production: The inventory division moves the raw materials from the warehouse to the production area, based on the production plan prepared by the planning department, as explained above. The production division manages the manufacture of products ordered by the customers from the raw materials that have been moved to the production area. After the manufacturing process is complete, the products undergo testing before being moved back to the warehouse, where they are stored until they are delivered to the customer.

Transportation: Once the finished and tested products arrive at the warehouse, the transportation (shipping) department identifies the most efficient way to ship the product so that it arrives on or before the date specified by the customer while ordering the product. The invoice for the finished goods is also delivered to the customer along with the goods.
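The components above can be sketched as an order moving through a fixed sequence of stages, from the customer’s order back to delivery to the customer. The stage names and the order fields below are illustrative, not taken from any real SCM system.

```python
# Sketch of the order flow described above: a sales order triggers
# planning, purchasing, inventory/production, testing, and shipping in turn.
# Stage names and order fields are invented for the example.

STAGES = ["ordered", "planned", "materials purchased",
          "in production", "tested", "shipped"]

class SalesOrder:
    def __init__(self, product, quantity, due_date):
        self.product, self.quantity, self.due_date = product, quantity, due_date
        self.stage = 0  # index into STAGES

    def advance(self):
        """Move the order to the next stage of the supply chain."""
        if self.stage < len(STAGES) - 1:
            self.stage += 1
        return STAGES[self.stage]

order = SalesOrder("widget", 100, "2010-04-01")
for _ in range(len(STAGES) - 1):
    order.advance()
print(STAGES[order.stage])  # 'shipped' -- the cycle ends back at the customer
```

Note how the customer appears at both ends: the order is created by a customer request, and ‘shipped’ closes the cycle with delivery (and the invoice) to the same customer.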

This brings us to the end of this blog post on Supply Chain Management (SCM). Thank you for your interest. 

Saturday, February 27, 2010

Enterprise Resource Planning (ERP)


In the previous blog post, we discussed Customer Relationship Management (CRM). In this blog post, let us look at some of the fundamental aspects of Enterprise Resource Planning (ERP). In general, a CRM system can be considered a subset of the features of an Enterprise Resource Planning system.

ERP is the industry-standard acronym for Enterprise Resource Planning. ERP is an Information Technology (IT) supported system used to integrate the data and processes of an organization in a seamless fashion. In the early days of ERP, the term referred to the way large organizations planned to use their organization-wide resources. Today, ERP systems are used in all types of organizations, from small and medium-sized businesses to large enterprises.

In the early days of computerization, core functions of an organization, like Customer Relationship Management, Human Resources, Supply Chain Management, and Financials, were all supported by stand-alone IT systems. This often resulted in duplication of data and the need for complicated data-transfer protocols between systems. In such systems, any data mismatch during transfer can result in problems. From a database management and administration perspective, too, it is often recommended to avoid duplication of data.

Current ERP systems are capable of covering a wide range of functions and integrating them into a single, unified database. A single, unified database removes the difficulties associated with transferring data between independent systems and duplication of data as discussed above. Enterprise Resource Planning systems can help in the management of many business activities, like sales, marketing, delivery, billing, production, inventory management, quality management, and human resource management, through a single system. ERP systems are sometimes referred to as ‘cross functional enterprise wide systems’ since all functional departments in an organization are managed through a single system.

The most important advantage of an ERP system is often cited as its ability to bring down operating costs and save valuable time that would otherwise be wasted on manual procedures and unwanted delays. An ERP system also ensures faster processing of information, reduces the burden of documentation and the associated manual workflows, avoids repeated data entry, and reduces cycle time. Another major advantage is efficient Customer Relationship Management. Customer queries and complaints can be tracked to closure very efficiently, resulting in high levels of customer satisfaction – a key parameter in evaluating the performance of any organization. [In the previous post of this blog, we had a closer look at Customer Relationship Management systems]

ERP systems ensure that access to sensitive data of the organization is controlled in a role-based manner. Thus, data is made available only on a ‘need-to-know’ basis, thereby plugging chances of leaking sensitive data.
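The role-based, need-to-know idea can be sketched as a lookup from role to permitted record types: a request is granted only if the role’s permission set covers the record. The roles and record types below are invented for the example, not taken from any real ERP product.

```python
# Sketch of role-based access control: each role is granted a set of
# record types; access is allowed only on a need-to-know basis.
# Role and record-type names are invented for the illustration.

ROLE_PERMISSIONS = {
    "hr_manager": {"salary", "appraisal"},
    "sales_rep":  {"customer_contact", "quote"},
    "accountant": {"salary", "invoice"},
}

def can_access(role, record_type):
    """Grant access only if the role's permission set covers the record type."""
    return record_type in ROLE_PERMISSIONS.get(role, set())

print(can_access("sales_rep", "salary"))   # False: no need to know
print(can_access("hr_manager", "salary"))  # True
```

Unknown roles fall back to an empty permission set, so anything not explicitly granted is denied by default.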

ERP systems ease project management, enable better tracking of work in progress, enable quick creation of status reports, and reduce process cycle time. ERP systems also act as ‘Decision Support Systems’, ensuring that decisions can be made on the basis of up-to-date information. Besides, ERP systems have also resulted in better vendor and supply chain management. Automated workflows in ERP systems allow organizations to track and identify bottlenecks in process flows and make improvements.

Even though the merits of ERP systems often outweigh the demerits, the system and its adoption by organizations are not without disadvantages. The adoption of ERP by an organization is often referred to as ‘implementing ERP’, since ERP systems usually require customization based on the needs of the organization.

First, an ERP implementation calls for a large investment in time and money. Next, the success of an ERP implementation depends on how well the employees of an organization understand the system and use it regularly in their day-to-day business activities. This calls for heavy investment in training employees. Besides the cost of training, the time employees spend in training often means that regular business activities are sidelined, resulting in loss of revenue and business opportunities. These disadvantages imply that an organization needs to carefully plan and weigh disadvantages against advantages before deciding to implement an ERP system.

The implementation of an ERP system does not guarantee solutions to all the problems that an organization is facing. In fact, if the implementation is not carefully planned and the cutover from existing systems to ERP is not finely orchestrated, an ERP implementation can result in more trouble. Still, a well-planned ERP implementation coupled with proper employee training and orientation will definitely enable an organization to compete globally in ever-changing business scenarios. To sum it up, such a carefully planned ERP system is often considered the perfect commercial embodiment of the motto: “Think Global. Act Local.”

This brings us to the end of this blog post on Enterprise Resource Planning (ERP). Thank you for your interest. 

Saturday, February 20, 2010

Customer Relationship Management (CRM)


"The purpose of business is to create and keep a customer." --Peter Drucker
Customer Relationship Management (CRM) enables businesses to do exactly that – to create and keep a customer. Customer Relationship Management systems are technology-assisted systems that enable enterprises to create and retain customers.

In this blog post, we will have a look at some of the basic concepts of Customer Relationship Management.  

There are several definitions of Customer Relationship Management, but the most common one seems to be: a CRM system involves the alignment of people, processes, and technologies that help an enterprise manage customer relationships in an organized way. The aim of CRM is to build a stronger relationship with customers, which leads to both customer loyalty and increased profits.

Customer Relationship Management helps an organization to:
a) assist its marketing department in identifying their best customers for repeat business, manage marketing campaigns, and generate leads, which have a high chance of conversion into sales, for the sales team.
b) improve telesales, account management, and sales management by optimizing information shared by multiple employees
c) develop personalized relationships with customers, with the aim of improving customer satisfaction and maximizing profits; to identify the most profitable customers and provide them the highest level of service.
d) equip employees with the information and processes necessary to know their customers, understand and identify customer needs and effectively build relationships between the company, its customer base, distribution partners, and vendors.

The marketing department plans and runs marketing campaigns, which are programs and activities by which companies advertise products and services to potential customers. Different types of marketing campaigns include:
(i) Awareness Campaigns – used to increase awareness of a brand
(ii) Brand Campaigns – generally used by new companies to connect brand with services and offerings
(iii) Lead Generation Campaigns – used to collect contact information for use in direct marketing
(iv) Customer Loyalty Campaigns – used to recognize and reward regular customers

The leads generated by marketing campaigns are shared within the sales and marketing teams. These leads are contacted separately and, depending on the response of the contact, are classified based on their probability of becoming a potential or prospective customer.

Once a lead gets elevated to the status of a potential or prospective customer [based on criteria set by the organization], the sales team aggressively keeps in touch with the contact until the potential customer is won (becomes a customer) or lost (decides not to go with the offering). The sales team is also responsible for providing details of the products/services offered by the organization. They also manage quotes/estimates and related negotiations. Other terms and conditions of the deal are also tracked and managed by the sales team. All information pertaining to activities carried out in relation to a lead or potential customer, like e-mails, telephone calls, and meetings, is tracked by the sales team. Appropriate follow-up actions are also carried out and monitored by the sales team. If the potential customer is lost, the circumstances are noted and analyzed later to avoid repeating such losses.
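One common way to classify leads is to score them against organization-defined criteria and map the score to a status. The criteria, weights, and cut-offs below are entirely invented for the sketch; as noted above, each organization sets its own.

```python
# Sketch of lead classification: score each lead on simple criteria and
# classify it by its likelihood of becoming a prospective customer.
# Criteria, weights, and cut-offs are invented for the illustration.

def classify_lead(lead):
    """Map a lead's attributes to a score, then to a status."""
    score = 0
    if lead.get("responded"):        score += 40  # contact replied to outreach
    if lead.get("budget_confirmed"): score += 40  # budget exists for the purchase
    if lead.get("decision_maker"):   score += 20  # we are talking to the decision maker
    if score >= 60:
        return "prospect"
    if score >= 40:
        return "potential"
    return "cold"

print(classify_lead({"responded": True, "budget_confirmed": True}))  # 'prospect'
print(classify_lead({"responded": True}))                            # 'potential'
```

The sales team would then concentrate follow-up effort on ‘prospect’ leads, as the text describes.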

Once a potential customer agrees to buy a product/service from the company, the potential customer gets elevated to the status of ‘customer’ or ‘client’. In some organizations, first-time customers are referred to as ‘customers’ and, from the second purchase onwards, as ‘clients’.

Developing personalized relationships with customers is a key focus area for any CRM system. For this, every customer is assigned to a team member within the organization who is primarily responsible for maintaining the customer relationship in a cordial fashion. This involves major tasks, like tracking the resolution of customer complaints, as well as relatively minor tasks, like sending wishes or gifts to the customer on his or her birthday, wedding anniversary, and other occasions of personal significance to the customer. Such notes from the organization, though relatively minor from a CRM perspective, can often have a profound impact on the customer’s ‘feel good’ factor.

Keeping a customer satisfied is good not only for repeat business from the same customer, but also for ‘word of mouth’ publicity. Customer Relationship Management enables organizations to understand their customers better, identify customer needs, and build effective relationships between the organization, customers, vendors, and distribution partners. To derive full benefit from Customer Relationship Management systems, they need to be tuned to the specific needs of the industry. Studies have shown that careful implementation and diligent use of a CRM system can increase sales volumes by 30% or more. ["On average, sales and marketing costs average from 15% - 35% of total corporate costs. So the effort to automate for more sales efficiency is absolutely essential. In cases reviewed, sales increases due to advanced CRM technology have ranged from 10% to more than 30%." --Harvard Business Review]

There are quite a few CRM Systems in use across different industry verticals. Open source CRM Systems are also becoming popular. We will not go into the details here as it is beyond the scope of this blog post.

Let us conclude this blog post with another quote by Peter Drucker: 
“We've spent the last 30 years focusing on the ‘T’ in IT[Information Technology], and we'll spend the next 30 years focusing on the ‘I’. ”  

And CRM is all about focusing on the ‘I’ – Information. 

~ Sunish


Monday, February 08, 2010

An Introduction to e-learning

"Online learning will rapidly become one of the most cost-effective ways to educate the world's expanding workforce."
--Jack Messman

Online learning, also known as e-learning, is fast becoming a preferred training mode in industry and academia alike. In this blog post, we will look at some of the basic concepts, advantages of e-learning, and explore the use of technology in e-learning.

Online learning can be asynchronous, meaning that learners determine when and how to access online learning content. This is in contrast with the synchronous model of training, where learners generally move through content together in a pre-determined sequence.

Another characteristic of online learning is that it is available ‘on demand’ and ‘just in time’. Online learning content is often customized and personalized as per preferences of the learner. The ‘just in time’ delivery model allows the content to be continuously updated resulting in content relevant to the context.

Online learning is learner controlled. This implies that the learner has the option to pause and play content at the learner’s pace. This also allows the learner to reflect on content learned before moving on to later modules.

The content used for online learning is designed to be re-usable. ‘Re-usable’ in this context means that basic units of content can be re-assembled to generate different types of content, suited to the different needs of the intended audience.
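Re-assembly of basic content units can be sketched as building different courses from the same pool of units. The unit names and course outlines below are invented for the example.

```python
# Sketch of re-usable learning content: the same basic units are
# re-assembled into different courses for different audiences.
# Unit names and course outlines are invented for the illustration.

UNITS = {
    "intro":     "What is BPM?",
    "modeling":  "Modeling a process",
    "bam":       "Monitoring with BAM",
    "casestudy": "A worked case study",
}

def assemble_course(unit_ids):
    """Build a course as an ordered list of unit contents."""
    return [UNITS[u] for u in unit_ids]

manager_course = assemble_course(["intro", "bam"])
analyst_course = assemble_course(["intro", "modeling", "casestudy"])
print(manager_course[0] == analyst_course[0])  # True: the same unit, re-used
```

Updating a unit in one place updates every course that re-uses it, which is the maintenance payoff of this design.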

Online learning is also designed to be platform independent. Content can be transformed into a variety of formats like XML, HTML, PDF, and e-book, resulting in the same content being easily available across different platforms.

Online learning also allows learners across the globe to collaborate in real time resulting in a highly interactive learning experience. Online learning when used for distance education enables trainers to interact with a large number of trainees at multiple locations in real-time, resulting in cost-effective training programs.

Moving on to technologies used in e-learning, the online learning industry initially tried to replicate the class room experience online. Later, the industry was guided by the fact that technology is only the delivery mechanism and the industry has focused on the best method of online content delivery that is most comfortable to learners.

The earliest e-learning courses were computer based training and web based training. In computer based training, learning content on CD-ROM or other media was distributed to students; in web based training, content was delivered over the Internet. In both cases, the course was meant to be taken by trainees as an asynchronous, self-paced course. Web based training allows content to be easily updated, and if the trainer and trainee are online at the same time, this mode allows interaction. Its disadvantages include the requirement of Internet connectivity, and where connectivity rates are high, it can be an expensive option for large multimedia files.

Most computer based and web based training courses are structured in a linear fashion, where the trainee is expected to follow a single path through the course content. Some courses allow the learner to navigate based on needs or interests. There are also sophisticated courses in which the path is customized according to trainee needs and the progress the trainee makes in the initial stages of the course.

The technologies used for delivering asynchronous e-learning include e-mail and discussion forums. E-mail provides a faster equivalent of the traditional correspondence course. It also acts as a support medium for learning management systems that allow uploading and sharing of content. Discussion forums provide a mechanism for discussion of specific course topics as well as informal exchanges related to course delivery. ‘Threading’ is a feature that allows related discussions to be grouped together, making it simpler to find related postings and responses. Threaded discussions are often also collapsible and expandable, allowing students to manage the number of posts shown on screen at a time and to browse groups of posts.
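The grouping that ‘threading’ performs can be sketched as posts that carry a reference to their parent post. The field names and sample posts below are illustrative:

```python
# Minimal sketch of 'threading' in a discussion forum: each post carries
# an optional parent id, and grouping by parent reconstructs the threads.
# Field names and sample data are illustrative.
posts = [
    {"id": 1, "parent": None, "text": "Week 1: what is a pure function?"},
    {"id": 2, "parent": 1,    "text": "One with no side effects."},
    {"id": 3, "parent": 1,    "text": "Same inputs, same outputs."},
    {"id": 4, "parent": None, "text": "Assignment 2 clarification"},
]

def replies(parent_id):
    """Collect direct replies to a post, preserving posting order."""
    return [p for p in posts if p["parent"] == parent_id]

# Top-level posts start threads; their replies are grouped beneath them.
for top in replies(None):
    print(top["text"])
    for child in replies(top["id"]):
        print("   ", child["text"])
```

Collapsing or expanding a thread in the user interface then amounts to showing or hiding one parent's group of replies.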

Audio conferencing (using telephone or VoIP [Voice over Internet Protocol]), electronic white boards, instant messaging, text chat, video communication, and web casting are some of the technologies that support delivering synchronous e-learning courses.

Audio conferencing allows a group to interact in real time by sharing voice accompanied by slides or text. Audio quality is often a bottleneck in this mode of delivery, since poor audio leads to a poor classroom experience for the trainees. The length of audio conferencing sessions, as with traditional classroom lectures, needs to be restricted to one to two hours. The technologies discussed below are used together with audio conferencing to enhance the classroom experience in synchronous e-learning courses.

An electronic white board is typically an electronic version of the dry-erase boards found in conventional lecture rooms. Electronic white boards are used for freehand writing and drawing, and range from simple graphical editors to sophisticated versions incorporating slide shows and other applications.

Instant messaging and text chat allow short and frequent messages between participants of a synchronous e-learning program. Instant messaging typically involves pairs of individuals, whereas text chat involves a group. These tools vary in complexity from simple messaging to versions with built-in file sharing and private messaging.

Videoconferencing extends the capability of audio conferencing by adding video. It enables instructors to either stream video or hold live video sessions between instructors and students, between students, or between multiple classrooms. As with audio, video quality has to be maintained for this mode of delivery to succeed. Streaming video is becoming more widely adopted and is often replayed rather than delivered live.
 
Web casting combines one or more of the technologies discussed above to deliver a synchronous learning experience to students.

Before we conclude this blog post on e-learning, let us look at a learning related quote in the context of organizations.

"An organization's ability to learn and translate that learning into action is the ultimate competitive advantage."
--Jack Welch

And e-learning will help organizations learn what they need to know, when they need to know it. 


~ Sunish

Sunday, January 31, 2010

Scala Programming Language – An Overview


The Scala programming language is, among other things, a functional programming language. Before we proceed further, let us have a quick re-cap of some of the core concepts of functional programming languages, which we covered in an earlier blog post titled ‘Functional Programming – An Overview’.

In Mathematics, ‘functions’ express the connection between parameters (inputs, in the case of computers) and the result (the output, in the case of computers) of certain processes. In each computation, the result depends on the parameters in a particular way and hence a ‘function’ is a good way of specifying a computation. This is the basis of ‘Functional Programming’.

The above notion is also closer to the ‘human world’ than to the world of the computer, where in the initial days of computing programs consisted of instructions to modify memory, executed by the central processing unit. Functional programming languages thus match the mathematical idea of functions.

A function is fundamentally a transformation. It transforms one or more inputs into exactly one output.

An important property of functions is that they yield no side effects: the same inputs will always yield the same outputs, and the inputs are not changed as a result of the function. Every symbol in a functional programming language is immutable.

Functional programming treats computations – running a program, solving a numeric calculation – as the evaluation of functions.

Having covered the key concepts of functional programming, let us move on to the industry scenario that led to the evolution of the Scala programming language. Moore’s Law is popularly stated as CPU speeds doubling roughly every 18 months (strictly, it describes the doubling of transistor counts). These days, however, the focus is on creating CPUs with multiple cores – multiple processing units within a single chip. This means that a multithreaded program executes on more than one core simultaneously, as opposed to the standard ‘round-robin’ scheduling on a single CPU. Multithreading on multiple cores requires that code be highly thread-safe.


Attempts to solve this problem of writing highly thread-safe code have resulted in many new programming languages that address concurrency, each with its own virtual machine or interpreter. This means a transition to a new platform is required, similar to what happened when organizations moved from C++ to Java about a decade ago. Such a transition is a non-trivial task, and most companies consider another transition risky. This set the stage for the arrival of the Scala programming language.

Scala is a statically typed, object-oriented programming language. It is also a functional programming language, and blends the best approaches of object-oriented and functional programming. Scala is designed and developed to run on the Java Virtual Machine (JVM), and its operational characteristics are the same as Java’s. In fact, the Scala compiler generates byte code nearly identical to that generated by the Java compiler. This compatibility ensures that Scala can utilize existing Java code, which in turn means that Scala has access to the existing ecosystem of Java code, including open-source code.

In Italian, ‘scala’ means stairway or steps. The name was selected to imply that the Scala programming language allows programmers to ‘step up’ to a programming environment that incorporates the latest in programming language design, while still letting them use all existing Java code. ‘Scala’ also stands for ‘scalable language’, meaning the language is designed to grow with the demands of its users.

Scala has been generating significant interest in the software industry and companies are announcing their move to Scala. Twitter, in April 2009, announced that they have switched a large portion of their backend to Scala and intend to convert the rest. Wattzon has mentioned that their entire platform has been written from the ground up in Scala.

Professor Martin Odersky is the creator of the Scala language. As a professor at EPFL in Lausanne, Switzerland, he works on programming languages, more specifically languages for object-oriented and functional programming. His research thesis is that the two paradigms are two sides of the same coin, to be unified as much as possible. To prove this, he has experimented with a number of language designs, from Pizza to GJ to Functional Nets. He has also influenced the development of Java as a co-designer of Java generics and as the original author of the current javac reference compiler. Since 2001, Prof. Odersky has concentrated on designing, implementing, and refining the Scala programming language.

Before we conclude this discussion, I would like to quote a reference to Scala from a previous blog post, titled ‘Technology Choices for 2009 and Beyond...’ posted on 24 September 2008.
Another relatively new [first public release in 2003] language, Scala, designed and built by the team led by Prof. Martin Odersky (EPFL, Switzerland) [Prof. Odersky has also influenced the development of Java as a co-designer of Java generics and as the original author of the current javac reference compiler] also seems to be promising. On a related note, in the article titled "Java EE meets Web 2.0" (written by Constantine Plotnikov, Artem Papkov and Jim Smith in developerWorks, November 2007), the authors identify principles of the Java EE platform that are incompatible with Web 2.0 and introduce technologies, including Scala, that close the gap.

This concludes our discussion on Scala programming language, which is expected to transform software engineering, the way Java programming language did about a decade ago. 


~ Sunish

Sunday, January 24, 2010

Open Source Software and Enterprise Computing – An Introduction

If we ask the software fraternity to define ‘Open Source’ in one word, the answer will most likely be ‘collaboration’. To elaborate further, we can define ‘Open Source’ as public collaboration on a software project with contributors from across the globe.

The Open Source Initiative (http://www.opensource.org) provides a ten-point definition of open source, which can be summarized as follows. More information on each of these ten aspects of open source can be found at http://www.opensource.org/docs/definition.php.

1. Free redistribution
2. Source Code
3. Derived Works
4. Integrity of the Author’s Source Code
5. No Discrimination Against Persons or Groups
6. No Discrimination Against Fields of Endeavor
7. Distribution of License
8. License Must Not Be Specific to a Product
9. License Must Not Restrict Other Software
10. License Must Be Technology-Neutral

Some of the reasons that make ‘Open Source’ important are:

(a) A community-driven process, which encourages technical leadership to accommodate collaboration.
(b) Open Source can be a major source of innovation, with collaborators participating in projects across physical boundaries.
(c) Wide distribution and deployment of standards that evolve from Open Source.
(d) Increased choice and flexibility for enterprise customers.

We will focus the rest of this discussion on Open Source Computing and its adoption by Enterprises.

There is little doubt that Open Source Software is experiencing explosive growth and coupled with that growth, adoption of Open Source Software by enterprises is growing. Some of the factors that are prompting enterprises to adopt Open Source Software are:

1. Shrinking IT budgets
2. Increasing Software Licensing Costs
3. Move toward Integrated Systems – one system for all Enterprise Users
4. Move to Web 2.0 initiatives to support marketing and enhance customer relationship management.

Given the above background, the key factors that push adoption of Open Source in Enterprises are:

(a) Cost: Reduced budgets obviously result in measures that save costs. Overall Information Technology costs can be reduced by implementing free or low-cost Open Source Software.
(b) Innovation: Open Source can be used to create new business offerings or innovative operation models, with substantial reduction in costs.
(c) Agility and Scale: Open Source Software provides the ability to quickly scale up and modify software systems to meet rapidly changing business requirements.
(d) No vendor lock-in: Reduces dependence on proprietary software vendors
(e) Quality and Security: Improves the operational efficiency of enterprise architecture by leveraging the open source characteristics of transparency and rapid improvement.

Some of the Open Source characteristics that make it particularly suitable and appealing to Information Technology organizations are:

1. Ability to inspect and modify source code: Open source mandates the availability of source code. This enables the enterprise adopters to inspect the source code to gain a better understanding of the software. It also helps in integrating Open Source Software with other systems. The ability to modify the source code enables enterprises to add new features and functionality. It also helps in adding security related modifications to meet the organization’s Information Security Audit requirements.

2. Development Transparency: Development Transparency means that the development process is carried out in public with all code changes available for inspection. It is relatively easy for a user to ascertain the current state and history of an open source product. Testing is also carried out on a large scale by collaborating developers, reported bugs are listed, and bug status maintained.

3. Liberal Licensing Terms: Proprietary software licenses are restrictive, with limits on installations, simultaneous users (floating licenses), fixed numbers of users, etc., and there is a fee associated with such licenses. Open Source licenses, by contrast, are permissive and encourage widespread use (please see the definition of Open Source at the start of this blog post). They do not impose limits such as a fixed number of users or installations. Acquiring Open Source Software is also free: service providers may charge fees for services like customization, security audits, and testing, but there is no fee for the software itself.

This concludes our discussion on ‘Open Source Software and Enterprise Computing’. 


~ Sunish

Sunday, January 17, 2010

Functional Programming - An Overview

Let us start this blog post on ‘Functional Programming’ with a widely accepted definition of computer programming – “computer programming is the process of creating a sequence of instructions which will enable a computer to do something”. Computer programming is a means to translate problems in the real world that need solving, into a format that computers can process.

Computer programming languages help convey instructions to computers. The goal of programming languages is to translate human language to machine code, the native language that computers understand.

Before we move on to an overview of functional programming, let us look at the different types (or paradigms) of programming languages. Please note that a given language is not limited to a single paradigm; a classic case is the Java programming language, which has elements of both the procedural and object-oriented paradigms.

a) Procedural Programming Languages: These languages specify a list of operations that a program must execute to reach a desired state. Each program has a starting state, a list of operations or instructions to complete, and an ending state. Two popular examples of procedural programming languages are BASIC (Beginner’s All-purpose Symbolic Instruction Code) and FORTRAN (the IBM Mathematical FORmula TRANslating System).

b) Structured Programming Languages: Structured programming can be considered as a special type of procedural programming, which requires the program to be broken down into small pieces of code, thereby increasing readability. Local variables (local to each subroutine) are preferred over global variables. These languages support a design approach called ‘top-down approach’ in which the design starts with a high-level overview of the system. System designers then add more details to the components in an iterative fashion until the design is complete. Popular languages include Pascal, Ada and C.

c) Object Oriented Programming Languages: This paradigm is the latest and considered the most powerful of all programming language paradigms so far. Here, system designers define both the data structures and the type of operations that can be applied to those data structures. This pair of data and operation(s) on the data is known as an object. A program can then be viewed as a collection of objects, which interact with one another. The important concepts associated with the object-oriented paradigm include classes/templates, inheritance, polymorphism, data encapsulation and messaging. However, a detailed note on these concepts is beyond the scope of our current discussion. Popular languages following this paradigm include Java, Visual Basic, C#, C++ and Python.

d) Functional and Other Programming Languages: This fourth category includes functional programming, along with paradigms not covered above such as concurrent programming and event-driven programming.

We will now return to the focus of our discussion – Functional Programming.

In Mathematics, ‘functions’ express the connection between parameters (inputs, in the case of computers) and the result (the output, in the case of computers) of certain processes. In each computation, the result depends on the parameters in a particular way and hence a ‘function’ is a good way of specifying a computation. This is the basis of ‘Functional Programming’.

The above notion is also closer to the ‘human world’ than to the world of the computer, where in the initial days of computing programs consisted of instructions to modify memory, executed by the central processing unit. Functional programming languages thus match the mathematical idea of functions. Functional programming offers a different approach to solving certain classes of problems, which we will cover later in this discussion.

The main characteristics of functional programming are as below:

(a) power and flexibility – many general, real world problems can be solved using functional constructs
(b) simplicity – most functional programming languages have a small set of key words and concise syntax for expressing concepts
(c) suitable for parallel processing – with immutable values and operators, functional programs are better suited for asynchronous and parallel processing

Since the concept of ‘functions’ is core to Functional programming, let us define a function before we proceed further.

“A function is fundamentally a transformation. It transforms one or more inputs into exactly one output”.

An important property of functions is that they yield no side effects: the same inputs will always yield the same outputs, and the inputs are not changed as a result of the function. Every symbol in a functional programming language is immutable.

Functional programming treats computations – running a program, solving a numeric calculation – as the evaluation of functions.
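These properties can be illustrated with a short sketch, shown here in Python for brevity; the function and data are hypothetical:

```python
# A pure function: the output depends only on the inputs, and the
# inputs are not modified by the call.
def scale(values, factor):
    """Return a new list with every value multiplied by factor."""
    return [v * factor for v in values]   # builds a new list

prices = [10, 20, 30]
doubled = scale(prices, 2)

print(doubled)   # [20, 40, 60]
print(prices)    # [10, 20, 30] -- unchanged: no side effects
```

Calling `scale(prices, 2)` a thousand times yields the same result every time, which is exactly the "same inputs, same outputs" property described above.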

Some of the classes of problems, which can benefit from a functional programming approach, are as below:

(i) multi-core and multi-threaded systems
(ii) sophisticated pattern matching
(iii) image processing
(iv) computer algebra
(v) lexing and parsing
(vi) artificial intelligence
(vii) data mining

Advantages of Functional Programming

(a) Unit Testing: We have already noted that every symbol in a functional programming language is final and hence immutable. This implies that no function can modify variables outside its scope, and hence functions cause no side effects. It follows that the only effect of evaluating a function is its return value, and the only thing that affects the return value is the function’s arguments (please see the definition of ‘function’ above). This makes unit testing much easier, since only the boundary values of the arguments need to be tested.
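For example, a pure function can be tested with a plain table of input/output assertions, with no set-up, mocks, or shared state. The `clamp` function below is an illustrative example, not from any particular library:

```python
# Because a pure function's result depends only on its arguments,
# a unit test is just a table of input/output pairs.
def clamp(x, low, high):
    """Restrict x to the inclusive range [low, high]."""
    return max(low, min(x, high))

# Boundary-value tests cover the function's behavior completely.
assert clamp(5, 0, 10) == 5     # inside the range
assert clamp(-3, 0, 10) == 0    # below the lower bound
assert clamp(42, 0, 10) == 10   # above the upper bound
assert clamp(0, 0, 10) == 0     # exactly on a boundary
```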

(b) Debugging: The absence of side effects as explained at (a) above makes debugging easier since bugs are local to a function. An examination of the stack quickly reveals the cause of error.

(c) Concurrency: In functional programming, data cannot be modified by two different threads, nor even twice by the same thread. Hence there is no scope for deadlocks or race conditions, which makes programming concurrent systems much easier.
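A small sketch of why this helps: a pure worker function can be mapped over shared input from several threads without any locks, because no thread mutates shared state. The worker function here is illustrative:

```python
# Because pure functions do not mutate shared state, the same input
# data can be handed to many threads without locks.
from concurrent.futures import ThreadPoolExecutor

def square(n):
    """Pure worker: touches no shared state."""
    return n * n

numbers = range(8)
with ThreadPoolExecutor(max_workers=4) as pool:
    # map distributes the work across threads but preserves input order.
    results = list(pool.map(square, numbers))

print(results)   # [0, 1, 4, 9, 16, 25, 36, 49]
```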

Apart from being a more appropriate tool for certain classes of computing problems, functional programming also allows programmers to make more efficient use of multi-core systems, develop concurrent/parallel algorithms easily and utilize the growing number of cloud computing platforms.

Functional programming is also considered a means for programmers to improve their problem-solving skills; it allows programmers to look at problems from a different perspective and become more insightful object-oriented programmers as well.

Popular functional programming languages include LISP, Haskell and F#. 


~ Sunish

Monday, January 11, 2010

Scratch – A route to fluency with new technologies


In the current scenario across the globe where technology is an integral part of our lives, the younger generation is often referred to as ‘Digital Natives’ because of their apparent fluency with digital technologies. Please note the use of the expression ‘apparent fluency’. This is because although young people are comfortable sending text messages (SMS), playing online games and browsing the web, such activities do not seem to make youngsters ‘fluent’ with digital technologies in the real sense of the word. To reiterate, despite the constant interaction of young people with digital media, few of them can create their own games, animations or simulations. In short, if digital technology is considered as a language, it is as if youngsters can “read” the language, but cannot “write” or express themselves using digital technologies.

This set the stage for the team that created the Scratch programming language. When the Scratch team started in 2003, its goal was to develop an approach to computer programming that would appeal to people who had not previously imagined themselves as computer programmers. The team’s aim was to make it easy for everyone, of all ages, backgrounds, and interests, to program their own interactive stories, games, animations and simulations, and to share their creations with other programmers.

The Scratch programming language was released to the public in 2007 and since then the Scratch website (http://scratch.mit.edu) has become a very active online community where people share, discuss and remix scratch programming projects. The collection of projects is quite diverse - birthday cards, video games, interactive tutorials, virtual tours and many others, all programmed in Scratch programming language. The core audience on the Scratch website is between the ages of 8 and 16 though there is a sizeable group of adult participants as well.

As users of the Scratch website program and share interactive projects, they:

1.    learn mathematical and computational concepts
2.    learn to think creatively
3.    reason systematically and
4.    work collaboratively

The above skills are often considered essential skills for the twenty first century. In fact, the primary goal of the team that created Scratch was not to prepare people for careers as professional programmers, but rather to nurture the development of a new generation of creative, systematic thinkers who are comfortable using programming to express their ideas. Further, digital fluency requires not just the ability to chat, browse and interact, but also the ability to design, create and invent with new media.

When personal computers were first introduced in the early 1980s, there was a lot of enthusiasm for teaching all children how to program. The commonly used languages were Logo and BASIC [Beginner’s All-purpose Symbolic Instruction Code]. (My school taught computer programming in 1988 in BBC BASIC, a variant of BASIC for BBC Microcomputers.)

The main factors that prevented the initial enthusiasm from being long lasting were:

1.    Difficulty in mastering the syntax of programming
2.    Programming based on scientific/mathematical activities that did not generate enough interest in children

Based on these past programming initiative experiences, the Scratch team established three core design principles for Scratch:

1.    more tinkerable
2.    more meaningful
3.    more social
 

1.    More Tinkerable: The Scratch grammar is based on a collection of graphical “programming blocks” that children snap together to create programs. Connectors on the blocks suggest how they should be put together. Children can start by tinkering with the blocks, snapping them together in different sequences and combinations to see what happens. There is none of the obscure syntax or punctuation of traditional programming languages. It is easy to get started with and the experience is playful.

Figure 1: Sample Scratch Scripts

Scratch blocks are shaped to fit together only in ways that make syntactic sense. Control structures like ‘forever’ and ‘repeat’ are C-shaped to suggest that blocks should be placed inside them, indicating the concept of scoping. Blocks that output values are shaped according to the types of values they return: ovals for numbers and hexagons for Booleans. Conditional blocks (‘if’ and ‘repeat-until’) have hexagon-shaped voids, indicating that a Boolean is required.
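The shape rule amounts to a simple type check: a slot's shape encodes which value types it accepts. A rough sketch of that idea in Python follows; the block names and type table are illustrative, not Scratch's actual internals:

```python
# Sketch of Scratch's shape-as-type rule: each block reports a value
# type, and a conditional's hexagon-shaped slot accepts only Booleans.
# Block names and the type table are illustrative.
BLOCK_TYPES = {
    "x position": "number",       # oval reporter block
    "touching edge?": "boolean",  # hexagon block
}

def fits_condition_slot(block: str) -> bool:
    """Only Boolean-valued (hexagon) blocks fit a conditional's slot."""
    return BLOCK_TYPES.get(block) == "boolean"

print(fits_condition_slot("touching edge?"))  # True
print(fits_condition_slot("x position"))      # False
```

In Scratch, this check is physical rather than textual: a mismatched block simply does not snap into the slot, so the type error can never be written down.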

2.    More Meaningful: It is widely accepted that people learn best, and enjoy it most, when they are working on personally meaningful projects. While developing Scratch, the team attached a high priority to:

a.    diversity – supporting many different types of projects such as stories, games, animations, simulations, etc., so that people with widely varying interests can all work on projects that they care deeply about.
b.    personalization – making it easy for people to personalize their Scratch projects by importing photos and music clips, recording voices, and creating graphics.

3.    More Social: The development of the Scratch programming language has been tightly coupled with the development of the Scratch website. From the Scratch team’s perspective, for Scratch to succeed, it had to be linked to a community where people could support one another, collaborate, critique one another’s work, and build on one another’s work. The concept of sharing is built into the Scratch user interface, with a prominent Share menu and icon at the top of the screen that allows a project to be uploaded to the Scratch website. Once a project is on the website, anyone can run it in a browser, comment on it, vote for it, or download it to view and revise the scripts. All projects shared on the website are covered by a Creative Commons license.

Looking at the future of the Scratch programming language, the following are a few of the major directions in which the project will be moving:

1.    More tinkerable, More Meaningful and More Social
2.    Scratch Sensor Board – for interacting with the physical world
3.    Scratch for mobile devices
4.    Web based version of Scratch
5.    Scratch-Ed – for Scratch educators; to share ideas, experiences and lesson plans

This brings us to the end of this blog post on Scratch, which is on a mission to expand the notion of digital fluency.

Thanks for your interest and for reading this blog post.


~ Sunish

Wednesday, January 06, 2010

Cloud Computing - Part Two


In part two of this two part blog post on cloud computing, we will cover:

1. Concerns related to cloud computing
2. Factors which can accelerate wide spread adoption of cloud computing

1. Concerns related to cloud computing

(a) Security: One of the biggest concerns related to cloud computing is security, because sensitive data may no longer reside on dedicated hardware secured within the enterprise’s own data centers. If the cloud is not secure enough, enterprises will hesitate to migrate their business data to the cloud platform.

(b) Poor Service Level Agreements: Service Level Agreement (SLA) is an integral part of the business relationship between a service provider and a customer. An SLA is essentially a contract between a service provider and a customer which clearly defines the business relationship, assures the customer that the service will meet stated requirements, and provides contingencies in case issues arise.

Poor or non-existent Service Level Agreements affect confidence in, and adoption of, cloud computing. Most enterprise IT organizations will not adopt cloud services on a large scale until service levels can be clearly spelled out and backed up. For many IT organizations, an SLA is a requirement for using any vendor’s service, since its absence puts the business at risk from an operational, financial, or liability standpoint.

The main issues commonly found in cloud computing related Service Level Agreements are:

•    Lack of guaranteed availability
•    Lack of guaranteed performance
•    Lack of guaranteed support and response time


(c) Inadequate Risk Assessment: Risk Assessment and Management is often considered the greatest concern in cloud computing. Risks associated with cloud computing can be generally classified into:

  (i) Legal, compliance and reputation risks
  (ii) Operational risks

Legal, compliance and reputation risks can result from cloud computing vendors leaking, losing, breaching, damaging or impeding access to various types of sensitive or valuable information. When information is leaked, damaged, or lost by a cloud computing vendor, the customer organization may face legal or regulatory consequences for which there is little recourse. Cloud customers are unlikely to repair the reputation damage by transferring the responsibility to the cloud vendor.

The majority of operational risks for cloud computing services are related to IT security, performance, or availability. Small to medium-sized organizations could see a net gain in operational security by using a professional cloud computing service. However, larger enterprises may see lower levels of security in the areas of strong encryption, access control, monitoring, and physical separation of resources.

(d) Vendor Lock-in: Vendor lock-in is a real and major concern in cloud computing. The factors that lead to vendor lock-in are:

    (i) Lack of interoperability between cloud services
    (ii) Inability to migrate to other cloud services
    (iii) Vendor management limitations at the customer’s end

(e) Management Issues: Two management issues are often associated with cloud computing: performance monitoring and troubleshooting, and data management. Many cloud computing service providers do not provide adequate tools for performance monitoring, and many lack the ability to troubleshoot effectively when issues arise. Similarly, some vendors do not provide tools for metadata manipulation or data extraction.

2. Factors which can accelerate widespread adoption of cloud computing

(a) Expenditure and ROI: As mentioned in part one of this post, cloud computing enables customers to defer large capital expenditure. This will probably be the biggest factor driving the widespread adoption of cloud computing. The current model is to buy as much infrastructure as is needed to meet estimated peak capacity, and in most cases this results in under-utilized IT resources. Cloud computing offers the ability to scale up and scale down as per demand, and a pay-as-you-go business model where the customer pays only for the services actually used. In financial terms, this translates into less capital expenditure and more operational expenditure. The advantage of operational expenditure is that it can be fine-tuned as per need, thereby resulting in more efficient utilization of financial resources and better return on investment (ROI).
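To make the cost difference concrete, here is a back-of-the-envelope sketch in Python. All demand figures and unit prices are purely illustrative assumptions, not any vendor's actual rates:

```python
# Illustrative comparison: fixed peak-capacity provisioning vs.
# pay-as-you-go cloud pricing. All numbers are made-up assumptions.

hourly_demand = [10, 12, 15, 80, 20, 11]  # servers needed in each period
hours_per_period = 100

# Traditional model: buy enough servers to cover the estimated peak,
# and pay for them whether they are used or not.
peak_servers = max(hourly_demand)
owned_server_hour_cost = 0.50  # assumed amortized cost per server-hour
capex_model_cost = (peak_servers * owned_server_hour_cost
                    * hours_per_period * len(hourly_demand))

# Cloud model: pay only for the servers actually used in each period,
# even though the per-unit price is assumed to be higher.
cloud_server_hour_rate = 0.70
opex_model_cost = sum(d * cloud_server_hour_rate * hours_per_period
                      for d in hourly_demand)

print(f"Fixed peak provisioning: ${capex_model_cost:,.2f}")
print(f"Pay-as-you-go cloud:     ${opex_model_cost:,.2f}")
```

With a spiky demand curve like the one assumed here, the pay-as-you-go total comes out well below the cost of permanently owning peak capacity, which is exactly the under-utilization argument made above.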

(b) Widespread Mobile Internet Access: It is fair to assume that in another 5 to 6 years, significant progress will be made in the field of Internet connectivity, resulting in the ability to connect to the Internet at all places where it is possible to connect to a mobile telecommunication tower. Further, the spread of 4G wireless standards will bring broadband Internet access to remote locations and will introduce true broadband connectivity to automobiles, trains and even commercial aircraft. This will boost cloud computing acceptance, as Internet access is a prerequisite for most cloud computing models. Another factor which will help acceptance of cloud computing is the availability of smartphones and netbooks which help mobile users connect to the Internet.

(c) Offline Access for Online Applications: Google Mail, or Gmail, is a commonly cited example where an online application is available for offline use when there is no Internet connectivity. This allows the user to continue working while disconnected from the online application hosted on a cloud computing platform. On restoration of Internet connectivity, changes made to the offline version are synchronized with the online version of the application. For cloud computing applications, this means that Internet connectivity is not always required for users to work with the application.
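The synchronization idea can be sketched with a toy client that queues changes made while offline and replays them on reconnect. The class and method names below are entirely hypothetical illustrations of the pattern, not Gmail's actual mechanism:

```python
class OfflineCapableClient:
    """Sketch of an online application client with offline support.

    Changes made while disconnected are queued locally and replayed
    against the server copy once connectivity is restored.
    """

    def __init__(self):
        self.online = True
        self.server_state = {}   # stands in for the cloud-hosted copy
        self.local_state = {}    # local (possibly offline) copy
        self.pending = []        # changes queued while offline

    def update(self, key, value):
        self.local_state[key] = value
        if self.online:
            self.server_state[key] = value     # apply immediately
        else:
            self.pending.append((key, value))  # defer until reconnect

    def reconnect(self):
        self.online = True
        for key, value in self.pending:        # replay queued changes
            self.server_state[key] = value
        self.pending.clear()


client = OfflineCapableClient()
client.update("draft1", "Hello")            # applied to server immediately
client.online = False                       # connectivity lost
client.update("draft2", "Written offline")  # queued locally
client.reconnect()                          # synchronized on reconnect
```

Real systems must additionally resolve conflicts when the same item changed both locally and on the server; this sketch sidesteps that by letting the later write win.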

(d) Separation of Data from Applications: In application development, it is becoming common practice to separate data from applications. To enable users to connect with minimal system prerequisites, application front ends are delivered via web pages which can be accessed from any browser. The backend is maintained separately, powered by highly scalable databases. Factors like WAN (Wide Area Network) speeds of over 100 Mbps, decreasing bandwidth costs and WAN acceleration technologies will assist the separation of data and applications.

This concludes part two of this two part blog post on “Cloud Computing”.


~ Sunish




Cloud Computing - Part One

Cloud computing, which extends the enterprise beyond the traditional data center walls, is quietly winning over CIOs across the world. Cloud computing not only offers a viable solution to the problem of addressing scalability and availability concerns for large-scale applications, but also displays the promise of sharing resources to reduce cost of ownership. The concept has evolved over the years, from data centers to present-day infrastructure virtualization. Although cloud computing is bringing about major changes in the way traditional IT infrastructure is managed, it is still not mature enough for widespread adoption in the IT industry.

We will try and look at a few aspects of Cloud Computing such as:

1.    What is cloud computing?
2.    Advantages of cloud computing
3.    Concerns related to cloud computing
4.    Factors which can accelerate widespread adoption of cloud computing

In part one of this two part blog post we will cover:

1. What is cloud computing?
2. Advantages of cloud computing

1. What is Cloud Computing?

A commonly found definition of cloud computing is:

“A set of disciplines, technologies, and business models used to render IT capabilities as on-demand services.”

A frequently asked question concerns the origin of the term ‘cloud’. In documents related to the Internet, it is common practice to depict the Internet as a cloud, owing to its distributed nature. Cloud computing shares this distributed nature, and hence the term ‘cloud’ was adopted.

Cloud computing is also often referred to as ‘the cloud’.

The common characteristics of cloud computing include:

(a) Shared Infrastructure: As per the cloud business model, the cloud service provider invests in the infrastructure necessary to provide software, platforms and related infrastructure as a service to multiple consumers. Hence the service providers have a financial incentive to leverage the infrastructure across as many consumers as possible.


(b) On-demand self-service: On-demand self-service is the cloud customer’s ability to purchase and use cloud services as per need. For example, as the number of users supported by the customer's application increases, the customer can add more storage space or processing power as needed. When the enhanced computing power or storage is no longer needed, the customer can scale down as well. Thus cloud computing’s ability to quickly provision and deprovision IT services creates an elastic and scalable IT resource. It is a pay-as-you-go model where customers pay only for the services that they actually use.

As an added advantage, it is also possible for cloud vendors to provide an application programming interface (API) that enables the customer to programmatically (or automatically, through a management application) scale cloud services up or down.
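As a sketch, such an API might be driven by a simple autoscaling rule. The `CloudClient` interface below is entirely hypothetical and stands in for whatever remote calls a real vendor API would expose; it only illustrates the idea of programmatic scale-up and scale-down:

```python
class CloudClient:
    """Hypothetical cloud vendor API client (illustrative only)."""

    def __init__(self, instances=1):
        self.instances = instances

    def scale_to(self, count):
        # In a real vendor API this would be a remote call; here we
        # just record the requested instance count (minimum one).
        self.instances = max(1, count)


def autoscale(client, current_load, capacity_per_instance=100):
    """Provision just enough instances for the observed load."""
    needed = -(-current_load // capacity_per_instance)  # ceiling division
    client.scale_to(needed)


client = CloudClient()
autoscale(client, current_load=750)   # demand spike -> scale up
print(client.instances)
autoscale(client, current_load=120)   # demand drops -> scale down
print(client.instances)
```

Because scaling is just a function call, a monitoring loop can invoke it automatically, which is the elasticity the paragraph above describes.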

(c) Consumption-based pricing model: As explained in (b) above, customers pay only for the services they actually use, resulting in per-hour or per-GB (gigabyte) prices. For example, CPU (Central Processing Unit; refers to computing power) time can be billed by the minute or the hour during which the CPU is actually in use. Data storage can be charged on the basis of GB stored. Data transfer can also be billed on the basis of MB (megabytes) or GB. In practice, it is also common for vendors to vary the pricing model for data storage and data transfer based on the geographic proximity of customers to the vendor’s data centers.
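A consumption-based bill is then just the sum of metered usage multiplied by unit prices. The rates below are illustrative assumptions, not any vendor's actual price list:

```python
# Illustrative consumption-based bill. All unit prices are assumptions.
usage = {
    "cpu_hours": 120,     # hours during which the CPU was in use
    "storage_gb": 50,     # GB stored for the billing period
    "transfer_gb": 200,   # GB transferred in/out
}
rates = {
    "cpu_hours": 0.10,    # $ per CPU-hour
    "storage_gb": 0.15,   # $ per GB stored
    "transfer_gb": 0.12,  # $ per GB transferred
}

# Total = sum over each metered item of (quantity used x unit price).
bill = sum(usage[item] * rates[item] for item in usage)
print(f"Total for the period: ${bill:.2f}")
```

A vendor varying prices by geographic region, as noted above, would simply swap in a different `rates` table per region.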

2. Advantages of Cloud Computing

Some of the key advantages of cloud computing can be listed as below:

(a) Simplifies and optimizes IT resources: In the current IT scenario, many organizations own and operate all of the IT resources for meeting their business objectives. Such organizations are often forced to install, maintain and upgrade complex solutions integrating different applications, operating systems, servers, networks and storage, to meet ever-growing business needs. This drives up the IT operational costs and prevents IT organizations from focusing on strategic business initiatives. This in-house management of IT resources also results in large capital expenditures which return little value to the business.


In the future, as cloud computing gains acceptance, organizations can reduce the size and complexity of internal IT operations by shifting non-strategic but essential IT resources to a cloud computing platform. Internal IT resources can then focus on more important, higher-level projects which can drive core business initiatives.

(b) Cuts costs and moves CAPEX to OPEX: Complex internal IT infrastructures consume a lot of electric power and also need operational personnel to monitor and manage expensive and underutilized IT equipment on a 24x7 basis. Also, in the case of some business scenarios, highly intensive computing power and storage capacity is required only for a few hours or days per month. Capital expenditure (CAPEX) is often more tightly controlled by finance departments than operational expenditures (OPEX). 


Moving to the cloud helps IT organizations relieve the workload on their already strained data centers. Cloud computing’s on-demand, consumption-based pricing model can help IT organizations defer large capital expenses or even avoid costs altogether.

Another classic case is the Test Hub that software development companies employ to simulate real-world scenarios. In Test Hubs, the IT resource configurations are often much larger and more complex than typical development environments. Cloud computing provides a quick and cost-effective way to boost computing power and data storage to simulate real-world scenarios in Test Hubs.

Since cloud computing expenses are classified as operational expenditure, they are subject to fewer budgetary controls, as explained above.

(c) Improved IT Resource Management: The IT resource procurement model in a typical organization is often an inefficient supply chain. The procurement cycle starts with system administrators predicting and factoring usage patterns into buying decisions to ensure sufficient capacity to satisfy growth over time. The procurement process must also allow for contingencies like delayed delivery of equipment, non-working equipment delivered, slow budgetary approvals and poor forecasting. In effect, more resources than are needed are purchased, and the operating resources remain underutilized.
 
Cloud computing’s on-demand, pay-as-you-go, consumption-based procurement model enables IT organizations to efficiently manage their IT resources and ensure better return on investment.

(d) Inexpensive Disaster Recovery: Building data centers with enough redundancy for disaster recovery can be an expensive proposition. Using an out-of-region co-location facility is also difficult without incurring high costs. Hence many organizations have poorly tested or even non-existent disaster recovery plans.

Here again, cloud computing services provide a viable alternative: they improve business continuity through disaster planning without incurring the high costs mentioned above.

This concludes part one of this two part blog post on “Cloud Computing”. In part two of this post, we will look at:

3. Concerns related to cloud computing
4. Factors which can accelerate widespread adoption of cloud computing

~ Sunish