Cloud computing is officially defined as “a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction” (Mell & Grance, 2011).
Five essential characteristics further define cloud computing: on-demand self-service, broad network access, resource pooling, rapid elasticity, and measured service (Mell & Grance, 2011). On-demand self-service dictates that computing capabilities, such as server time and network storage, can be provisioned by a consumer without requiring human interaction with a service provider. Broad network access implies that a cloud provider's capabilities are available over the network and accessible through standard mechanisms. Through resource pooling, a cloud provider's resources are pooled to serve multiple consumers, often following a multi-tenant model. Rapid elasticity means that cloud capabilities can be automatically provisioned and released so that capacity scales rapidly outward and inward with demand. Finally, measured service means that cloud systems automatically control and optimize resource utilization by leveraging a metering capability.
There exist three basic service models of cloud systems (Mell & Grance, 2011). Software as a Service (SaaS) is characterized by the ability of a consumer to utilize a provider’s application through various client devices and interfaces. Platform as a Service (PaaS) is provided to a consumer as the ability to deploy custom applications onto a platform hosted by a cloud service provider. Infrastructure as a Service (IaaS) allows a consumer to provision a service provider’s individual resources—such as processing, storage, and networking—to meet a specific use case.
Cloud services are deployed using one of four distinct models (Mell & Grance, 2011). A private cloud is a cloud infrastructure provisioned exclusively for a single organization. A community cloud is an infrastructure provided to a specific community of consumers or organizations with shared concerns. The public cloud, the most open model, is available for use by the general public. A hybrid cloud is a composition of two or more of the aforementioned models which, while tied together by technology enabling data and application portability, remain unique entities.
The portability and ease of use that the cloud computing model brings to enterprise services do not come free of drawbacks. Foremost among these is security, which remains the primary concern hampering broader adoption of cloud computing. There are several reasons to be cautious, sometimes even apprehensive, about adopting a cloud service or model (Morsy, Grundy, & Müller, 2010). First, developing enterprise software on a provisioned cloud involves outsourcing security management to the cloud or service provider, limiting the control a consumer has over security. Second, the multi-tenancy employed in many cloud services implies shared resources among tenants, a design that can be exploited by malicious users if not adequately secured. Third, the broadly accessible nature of cloud services increases the probability of attacks on cloud-hosted solutions. Further security complexities arise from the web of intricate dependencies between public offerings: for example, SaaS deployed on a PaaS depends on the security offered by that PaaS, which in turn depends on the security settings of the IaaS it may itself be utilizing.
There are a plethora of ways to run an enterprise system, but harnessing the cloud has become a popular choice because cloud services offer elasticity, low upfront investment, fast time to market, and an attractive pay-per-use model (Agrawal et al., 2011). Elasticity is an important trait of the cloud that gives companies the flexibility to scale up or down according to their needs; most businesses do not see a static flow of traffic year-round, and most websites do not even see static traffic over a single day. One major factor limiting the scalability of an enterprise is the flexibility of its storage system. Amazon addresses this with AWS S3, which provides on-demand storage capacity (Marston et al., 2011). Amazon also provides its DynamoDB service, which uses a NoSQL DBMS rather than a relational DBMS, since key-value stores perform better for the large-scale applications typically deployed in the cloud (Agrawal et al., 2011). Because these services come ready to use on a pay-per-use plan, companies do not have to invest as much time and money to get their operations off the ground, allowing them to reach the market quicker while maintaining performance under a variety of workloads.
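The access pattern behind key-value stores like DynamoDB can be illustrated with a short sketch. The class and partitioning scheme below are invented for illustration, assuming only the general key-value model described above; this is not Amazon's implementation, but it shows why single-key lookups partition and scale so naturally.

```python
# Minimal sketch of the key-value model behind stores such as DynamoDB:
# every item is addressed by a single key, so reads and writes are O(1)
# hash lookups, and the table can be spread across many nodes simply by
# hashing the key. (Illustrative toy only; not Amazon's implementation.)

import hashlib

class KeyValueStore:
    """A toy key-value store partitioned across N simulated nodes."""

    def __init__(self, num_partitions=4):
        self.partitions = [{} for _ in range(num_partitions)]

    def _partition_for(self, key):
        # Hash the key to pick a partition, much as a distributed
        # store picks a storage node for an item.
        digest = hashlib.sha256(key.encode()).hexdigest()
        return int(digest, 16) % len(self.partitions)

    def put(self, key, value):
        self.partitions[self._partition_for(key)][key] = value

    def get(self, key, default=None):
        return self.partitions[self._partition_for(key)].get(key, default)

store = KeyValueStore()
store.put("user#42", {"name": "Ada", "plan": "pay-per-use"})
print(store.get("user#42")["name"])  # prints "Ada"
```

Because no query ever spans more than one key, adding partitions (nodes) increases capacity without coordination between them, which is the property that makes key-value stores attractive for large-scale cloud workloads.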
The cloud is a very attractive tool in the realm of education. Cloud services not only modernize a college and its courses but can also reduce the school's overall expenditures (Sultan, 2010). According to Sultan, companies such as Google and Microsoft have been introducing their cloud services to schools in eastern Africa in an attempt to raise the standard of education in these poor nations. Because the services are hosted on servers owned by these large tech companies, the schools do not have to spend capital beyond the cost of the cloud services themselves. Cloud computing proved popular enough that the University of Westminster dropped its old email infrastructure and migrated its services to Google Apps, now known as G Suite (Sultan, 2010). A case study of the migration showed that, besides improving the user experience for students, the Education Edition of Google Mail cost the university nothing to use, whereas setting up an internal mail server with the same storage capacity would have cost roughly one million pounds.
Another example of cloud utilization in educational institutions is UC Berkeley's adoption of cloud services for its SaaS application courses (Sultan, 2010). The university was able to decouple its SaaS courses from an outdated infrastructure and move its operations to the cloud hosted by Amazon Web Services. A main reason for the switch was scalability: cloud services made it easy for the school to create a large number of servers in a very short amount of time to accommodate the courses' students (Sultan, 2010).
According to Marathe et al. (2016), exploiting redundancy in Amazon EC2's auction market can reduce expected cost. Amazon EC2 "provides fixed-cost and variable-cost, auction-based points." A user may place a bid and obtain access to a node, but if the market price exceeds the bid price, the user is stripped of that node without any warning. Marathe et al. implemented algorithms to determine the bid price and execute programs at a lower cost than both the on-demand market and any non-redundant auction-market algorithm; this adaptive algorithm was shown to reduce costs further when informed with the scalability characteristics of applications. Savings of up to 56 percent of the expected cost were recorded relative to the original "adaptive algorithm run at a fixed, user-defined scale" (Marathe et al., 2016).
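The revocation mechanic described above can be made concrete with a small simulation. The function, prices, and fallback policy below are hypothetical, chosen only to illustrate the spot-market rule that a node is held while the bid meets the market price and lost otherwise; this is a sketch of the mechanism, not Marathe et al.'s bidding algorithm.

```python
# Toy simulation of the EC2 spot-market rule: the user keeps a spot
# node only while the bid covers the market price, and is charged the
# market price for those hours; hours where the market price exceeds
# the bid are modeled as falling back to on-demand pricing. All prices
# here are invented for illustration.

def spot_cost(market_prices, bid, on_demand_price):
    """Return (hours_completed, total_cost) for a job run hour by hour
    on a spot node with an on-demand fallback after revocation."""
    hours_completed = 0
    total_cost = 0.0
    for price in market_prices:
        if price <= bid:
            total_cost += price            # spot hour: pay market price, not the bid
        else:
            total_cost += on_demand_price  # revoked without warning: pay on-demand
        hours_completed += 1
    return hours_completed, total_cost

# Hypothetical hourly market prices against a $0.50/hour on-demand rate.
prices = [0.12, 0.18, 0.55, 0.20, 0.15]
hours, cost = spot_cost(prices, bid=0.30, on_demand_price=0.50)
print(f"spot with fallback: ${cost:.2f} vs pure on-demand: ${len(prices) * 0.50:.2f}")
```

Even this naive policy is cheaper than running purely on-demand; the algorithms described by Marathe et al. go further by choosing bid prices and redundant nodes so that revocations rarely stall the computation.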
El Shazly et al. (2017) provided an in-depth explanation of Multi-Cloud GenomeKey (MC-GenomeKey), a package capable of efficiently executing the variant-analysis workflow for detecting and annotating mutations while utilizing cloud resources from multiple commercial providers, such as Amazon, Google, and Azure, as well as any other OpenStack cloud platform. The package can efficiently utilize a cluster of nodes in which each node comes from a different cloud. MC-GenomeKey exploits Amazon's spot-instance model in combination with other cloud platforms, which provides significant cost reduction, and optimizes workflow execution "using computational resources from different cloud providers" (El Shazly et al., 2017). Running across different cloud platforms allows the user to take advantage of the best offers.
Because base cloud offerings are general-purpose, the industry-specific cloud is an important progression for cloud computing's continued growth. Businesses in different industries have specific requirements that are not met by base provisions, so many commercial cloud providers develop offerings tailored to the needs of particular industries, including health, education, energy, and pharmaceuticals, "by providing ready-made standard software services to deliver specific functionality through services" (James & Chung, 2015). Because the industry-specific cloud caters its services to each business, the efficiency of the work done is greatly increased. Although the industry-specific cloud is being utilized across industry, many companies have for the time being passed on these tailored cloud services because of concerns about security, privacy, control, and interoperability. The next step, then, is for cloud providers and domain experts to work together to overcome these concerns.
Ryan states that cloud computing means entrusting our information to systems hosted and managed by external parties on remote servers "in the cloud," which raises concerns about privacy and confidentiality. These concerns arise because service providers could use the information stored on their servers for unauthorized purposes or accidentally lose it. A typical example is the conference management system (CMS): Ryan notes that a privacy concern with cloud-based CMSs such as EDAS and EasyChair is that the systems' administrators are also custodians of large amounts of data, which can lead to accidental or deliberate disclosure.
Another concern Ryan expressed is beneficial data mining, a form of abuse of conference data in which administrators take advantage of their roles and leak stored information to third parties for whom it is valuable. Ryan proposed a solution through policies and legislation: the articulation of clear policies that circumscribe the ways in which data may be used. He added that such policies should be complemented by the processing of encrypted data in the cloud. Progress has been made on encryption schemes that allow users to send encrypted data to the cloud and let service providers compute on and search over that data without the ability to decrypt and misuse it. The problems with this kind of solution, Ryan notes, are that the required techniques are expensive in computation and bandwidth and show little sign of being widely practiced in industry.
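The idea of a provider computing on data it cannot read can be demonstrated with a toy homomorphic property. The sketch below uses textbook (unpadded) RSA, which is multiplicatively homomorphic; the tiny hand-picked key offers no real security and the example is only an illustration of the principle behind the schemes Ryan alludes to, not any production system.

```python
# Toy demonstration of computing on encrypted data. Textbook RSA is
# multiplicatively homomorphic: Enc(a) * Enc(b) mod n = Enc(a * b),
# so a server can multiply ciphertexts without ever decrypting them.
# The tiny parameters below (p=61, q=53) are for illustration only.

p, q = 61, 53
n = p * q            # modulus, 3233
e = 17               # public exponent
d = 2753             # private exponent: e*d = 1 (mod (p-1)*(q-1))

def encrypt(m):
    # Client-side: encrypt with the public key (e, n).
    return pow(m, e, n)

def decrypt(c):
    # Client-side: decrypt with the private key (d, n).
    return pow(c, d, n)

a, b = 7, 11
ca, cb = encrypt(a), encrypt(b)

# Server-side: multiply the ciphertexts without seeing a or b.
c_product = (ca * cb) % n

# Client-side: decrypting recovers the product of the plaintexts.
print(decrypt(c_product))  # prints 77
```

This also hints at the cost Ryan mentions: fully homomorphic schemes, which support arbitrary computation rather than a single operation, remain far more expensive in computation and bandwidth than this toy suggests.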
Hardware-based security initiatives such as the Trusted Platform Module and Intel's Trusted Execution Technology were created to allow remote users to trust that data submitted to a platform is processed according to a clear policy. Ryan showed that these technologies could be used to provide privacy guarantees in cloud computing in general, and in CMS software in particular, although he noted that significant research would be needed before a usable system could be developed.