Even though cloud computing has enabled many organizations to improve their operations, that doesn’t mean they’re using the technology in the same way. The needs and desires of different businesses and industries require different deployment approaches, whether through public, hybrid, or private models. In addition, how users interact with the applications that run on these architectures varies considerably.
Helping aircraft take flight
Brandon Butler, a contributor to CIO, noted that commercial aircraft manufacturer Boeing is merging the capabilities of on-premise virtualized workloads with a public cloud solution to create a hybrid environment. David Nelson, the company's chief cloud strategist, stated that the applications the organization runs in this environment operate more efficiently and serve Boeing's needs far better than they did in an in-house data center.
Hosted in the public cloud, one of the tools used by Boeing monitors the flight patterns of planes around the world. It incorporates both real-time and historical data, which translates to a huge amount of traffic running through the system on a consistent basis. Previously, the application ran on five laptops synced together, machines that required diligent cooling. Nelson stated that the digital information involved so much detail and analysis that those machines couldn't host the program efficiently.
One of the most interesting applications Nelson described takes on-premise Boeing resources and merges them with a public cloud environment. To deliver better assistance to remote mechanics working on its machines, Boeing launched a tool that allows technicians to research materials as well as conduct and verify maintenance and repairs. In addition, Boeing aircraft specialists can contribute to the system.
“It’s seamless to the end user,” said Nelson, as quoted by the news source. “But it provides all the functionality they need.”
Reducing the cost of financial data
According to InformationWeek contributor Andres Rodriguez, about 80 percent of enterprise data consists of files in the form of email, documents, and other unstructured information. The problem many organizations with traditional storage methods face is that synchronizing this data through conventional virtualization techniques creates a footprint that doubles or even triples, consuming more and more volume over time.
Rodriguez interviewed a chief information officer from a major financial institution who was charged with reducing IT costs by 8 percent every year. After the CIO consulted with his team, he realized the amount of data the company was handling was consuming the majority of the division's budget. To mitigate this issue, the CIO decided to launch a program that would integrate the object store of a cloud storage solution with cloud-integrated storage (CIS) devices.
What this system essentially did was break files into smaller, more manageable chunks of data and transfer them to the object store, which replicates the information and makes it available through several nodes. This procedure depends on a clustered architecture in which connected servers interact with each other far more efficiently than if each had to communicate separately.
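The chunk-and-replicate process described above can be sketched in a few lines. This is a minimal illustration, not the actual CIS product logic: the chunk size, replica count, and hash-based node placement below are all assumptions chosen for clarity.

```python
import hashlib

CHUNK_SIZE = 4 * 1024 * 1024  # split files into 4 MB pieces (illustrative value)
REPLICAS = 3                  # copies of each chunk kept across nodes (illustrative value)

def chunk_file(data: bytes, chunk_size: int = CHUNK_SIZE):
    """Break a file's bytes into fixed-size chunks keyed by content hash.

    Returns an ordered manifest of chunk keys (for reassembly) plus a
    store mapping each key to its bytes. Identical chunks share one key,
    so duplicate data is stored only once.
    """
    manifest, store = [], {}
    for offset in range(0, len(data), chunk_size):
        piece = data[offset:offset + chunk_size]
        key = hashlib.sha256(piece).hexdigest()
        manifest.append(key)
        store[key] = piece
    return manifest, store

def place_replicas(chunk_key: str, nodes: list, replicas: int = REPLICAS):
    """Pick which storage nodes hold copies of a chunk (simple hash-based spread)."""
    start = int(chunk_key, 16) % len(nodes)
    return [nodes[(start + i) % len(nodes)] for i in range(min(replicas, len(nodes)))]

# Example: a 10 MB "file" spread over four hypothetical nodes
nodes = ["node-a", "node-b", "node-c", "node-d"]
data = b"x" * (10 * 1024 * 1024)
manifest, store = chunk_file(data)
placement = {key: place_replicas(key, nodes) for key in store}

# Reassembly walks the manifest in order and pulls each chunk from the store
reassembled = b"".join(store[key] for key in manifest)
```

Because chunks are keyed by content hash, repeated data collapses to a single stored object, which is one way this kind of design keeps the storage footprint from doubling or tripling as files are synchronized.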
“By transferring the state of their file systems natively to the object store, these reinvigorated network-attached storage platforms can scale to absorb an unlimited number of files or snapshots regardless of the size of the device,” noted Rodriguez.
What this approach offers is a cost-effective way to manage large stores of digital information. Though different from the techniques used by Boeing, it's an effective strategy for a financial institution responsible for securing consumer monetary data.