Public Cloud Appealing to Those Needing Disaster Recovery

May 9th, 2014

These days, businesses are aggregating an incredible amount of data from a lot of different silos. Whether they’re using the information to create enhanced marketing campaigns, conduct research for product development, or look for a competitive edge in the market, these companies are taking whatever steps are necessary to protect that data. Between data breaches and natural occurrences like severe weather that can cause companies to lose their data, many are moving their disaster recovery initiatives to cloud servers.

A broken disk.

A practical solution
One of the most popular deployment options, public cloud models offer companies the opportunity to back up their data in encrypted, secure environments that can be accessed whenever it’s convenient. However, businesses are looking to take this capability to the next level. Redmond Channel Partner referenced a study sponsored by Microsoft titled “Cloud Backup and Disaster Recovery Meets Next-Generation Database Demands,” which was conducted between December 2013 and February 2014 by Forrester Consulting.

The research firm polled 209 organizations based in Asia, Europe, and North America, with 62 percent of survey participants consisting of large-scale enterprise IT managers. Many of the businesses reported having mission-critical databases larger than 10 terabytes. Respondents claimed that some of the top reasons for using public cloud computing models for backups included saving money on storage (61 percent) and reducing administration expenses (50 percent).

Forrester noted that a fair number of enterprises omit encrypting their database backups, citing the complexity involved and the risk of data corruption. A number of participants also acknowledged neglecting to test their disaster recovery capabilities.
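Encrypting a backup before it leaves the premises can be less complex than many of those respondents assume. The sketch below uses the third-party `cryptography` package's Fernet recipe (authenticated symmetric encryption); the function names and the sample payload are illustrative, not part of any product mentioned above.

```python
# Illustrative sketch: encrypt a database backup with a symmetric key
# before shipping it to cloud storage. Requires the "cryptography"
# package (pip install cryptography). Names here are hypothetical.
from cryptography.fernet import Fernet


def encrypt_backup(plaintext: bytes, key: bytes) -> bytes:
    """Return an authenticated, encrypted copy of a backup payload."""
    return Fernet(key).encrypt(plaintext)


def decrypt_backup(token: bytes, key: bytes) -> bytes:
    """Recover the original backup; raises InvalidToken if the data
    was corrupted or tampered with in transit."""
    return Fernet(key).decrypt(token)


if __name__ == "__main__":
    key = Fernet.generate_key()  # store the key separately from the backup
    dump = b"-- SQL dump of a mission-critical database --"
    token = encrypt_backup(dump, key)
    assert decrypt_backup(token, key) == dump
```

Because Fernet is authenticated, a corrupted backup fails loudly at restore time instead of silently restoring bad data, which addresses the corruption worry the survey respondents raised.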

The available opportunities
Despite these drawbacks, Forrester's study showed that cloud-based backup and disaster recovery (DR) models have matured over the past four years. There's also the option of a hybrid approach that combines on-premises DR solutions with public cloud storage. For example, an enterprise could keep all its data in in-house databases and orchestrate a system that duplicates or transfers that data into a cloud storage environment in the event of a problem.
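The "duplicate into a second location" step of such a hybrid setup can be sketched in a few lines. In this minimal, standard-library-only example, a local directory stands in for the cloud storage target; in a real deployment you would call your provider's SDK instead, and all paths shown are hypothetical.

```python
# Minimal sketch of replicating a backup file to a DR target and
# verifying it arrived intact. A local directory stands in for the
# cloud storage bucket; paths are hypothetical.
import hashlib
import shutil
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Checksum used to verify the replica matches the source."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def replicate(source: Path, dr_target_dir: Path) -> Path:
    """Copy a backup file to the DR target, then verify the copy."""
    dr_target_dir.mkdir(parents=True, exist_ok=True)
    replica = dr_target_dir / source.name
    shutil.copy2(source, replica)  # preserves timestamps as well
    if sha256_of(replica) != sha256_of(source):
        raise IOError(f"replica of {source} failed checksum verification")
    return replica
```

The checksum step matters: the survey's respondents who skip DR testing are effectively trusting that copies like this one succeeded, and verifying each replica is the cheapest possible form of that test.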


An “Expert Interview” with Kole Hicks, Sr. Director of Products at GoGrid

May 8th, 2014

If you’re interested in the answers to many common Big Data implementation questions, Syncsort, a leading provider of Big Data integration solutions, recently posted an interesting interview with our very own Kole Hicks, Sr. Director of Products. In it, blogger Mark Underwood poses several key questions to consider when beginning a Big Data project, starting with “What are the biggest obstacles?” and going all the way to “What are the in-house requirements for Big Data?”


Check out the complete interview on Syncsort’s blog.

And of course if you’re interested in a Big Data solutions integrator, the combination of Syncsort and GoGrid infrastructure might just be an ideal way to get you up and running with the push of a button!

You can learn more about Syncsort on its website.

FBI: Health Care Providers Need to Improve Security

May 6th, 2014

There’s no disputing that by implementing cloud servers, physicians, nurses, and hospital administrators will be able to store and access patient information more easily than before. Although such an approach helps them develop treatments for specific patients, IT professionals and government officials believe care facilities need to improve their security before moving to the cloud.

Nurses and doctors accessing patient information.

A number of cloud solutions offer expanded data protection; however, the current state of many electronic health records systems is lackluster, at best. Data flowing between hospital PCs and mobile devices opens new avenues that hackers could potentially exploit to steal sensitive personal health information.

An official security warning 
According to Reuters, the Federal Bureau of Investigation recently informed health care providers that their cyber-security infrastructures were unsatisfactory compared to those of other industries. Although cyber criminals have been known to attack the retail and financial sectors, they could also use electronic records containing insurance and payment information to gain access to bank accounts, personal addresses, phone numbers, and other data.

Reuters obtained a private notice sent to hospital administrators criticizing their lax network defense programs. Issued earlier this month, the memo did not cite any specific incident, but it implored recipients to contact the FBI in the event any breaches occurred.

The source stated that criminals typically favor health care information because it takes longer for victims to realize their data has been stolen. Although hackers often don’t leverage the information themselves, they frequently sell such data on the black market. To deter infiltration attempts, some hospitals have invested in cloud infrastructure featuring applications that encrypt data as it flows through their networks.
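The most common way to encrypt data in flight between systems like hospital PCs and mobile devices is TLS. As a minimal sketch using only Python's standard library, a client can wrap an ordinary socket in a TLS channel with certificate and hostname verification enabled; the host name below is hypothetical.

```python
# Sketch of encrypting data in flight: wrap a client socket in TLS so
# traffic between endpoints is encrypted and the server is verified.
# The host name is hypothetical.
import socket
import ssl


def open_encrypted_channel(host: str, port: int = 443) -> ssl.SSLSocket:
    """Return a TLS-wrapped socket to the given host."""
    # create_default_context() enables certificate validation against
    # the system CA store and hostname checking by default.
    context = ssl.create_default_context()
    raw = socket.create_connection((host, port))
    return context.wrap_socket(raw, server_hostname=host)
```

The key design point is that the default context refuses unverified certificates, so a man-in-the-middle on the hospital network cannot silently intercept the records in transit.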


Public Cloud Deployments Will Improve Through Competition

May 2nd, 2014

The world of cloud computing is undergoing a monumental shift. Competition across private, hybrid, and public cloud solution providers has been heating up thanks to new innovations and falling prices. As adoption becomes increasingly affordable, small and midsize businesses are looking to capitalize on scalable storage space and flexible communications.

Employees access files and applications stored on public cloud architectures.

Anything you can do, I can do cheaper 
Pedro Hernandez, a contributor to TechWeek, noted that Microsoft is making good on its promise to match public cloud prices set by Amazon, reducing computing expenses by 35 percent and storage by about 65 percent. Upon taking over as the corporation’s new leader, CEO Satya Nadella stated that he would spearhead a “cloud first, mobile first” business plan to integrate all of Microsoft’s products so that they work together more seamlessly.

Steven Martin, general manager for Microsoft Azure, noted that the economics of the cloud business are certainly a major factor in the cloud storage market, but they don’t necessarily guarantee a profitable result. The executive claimed that Microsoft plans to invest heavily in research and development, looking for new approaches and infrastructure designs that will deliver a more secure, operable public cloud framework. In addition, the company expressed interest in searching for new partnerships in an effort to gain outside insight into an increasingly competitive market.

Getting down to specifics, the cost reductions for Microsoft’s cloud servers will be organized around two models. “Standard” will be defined as general-purpose virtual machines and won’t offer load balancing or auto-scaling. Although this change will result in a 35 percent price reduction, industry critics have speculated about whether the company may be sacrificing quality. The second change targets storage expenses, cutting costs for locally redundant storage by 65 percent.

IT departments feeling the heat 
Amid the fluctuating marketplace, IT teams at large and midsize companies are left wondering where they should start. Organizations encountering a high volume of data traffic often require a public environment capable of handling it all. The requirements don’t stop there, either. Employees are continuing to use mobile devices to access company documents and information so they can work on the go and out of the office more frequently than before.


HBase Made Simple

April 30th, 2014

GoGrid has just released its 1-Button Deploy™ of HBase, available to all customers in the US-West-1 data center. This technology makes it easy to deploy either a development or production HBase cluster on GoGrid’s high-performance infrastructure. GoGrid’s 1-Button Deploy™ technology combines the capabilities of one of the leading NoSQL databases with our expertise in building high-performance Cloud Servers.

HBase is a scalable, high-performance, open-source database. HBase is often called the Hadoop distributed database – it leverages the Hadoop framework but adds several capabilities such as real-time queries and the ability to organize data into a table-like structure. GoGrid’s 1-Button Deploy™ of HBase takes advantage of our SSD and Raw Disk Cloud Servers while making it easy to deploy a fully configured cluster. GoGrid deploys the latest Hortonworks distribution of HBase on Hadoop 2.0. If you’ve ever tried to deploy HBase or Hadoop yourself, you know it can be challenging. GoGrid’s 1-Button Deploy™ does all the heavy lifting and applies all the recommended configurations to ensure a smooth path to deployment.

Why GoGrid Cloud Servers?

SSD Cloud Servers have several high-performance characteristics. They all come with attached SSD storage and large available RAM for the high I/O uses common to HBase. The Name Nodes benefit from the large RAM options available on SSD Cloud Servers and the Data Nodes use our Raw Disk Cloud Servers, which are configured as JBOD (Just a Bunch of Disks). This is the recommended disk configuration for Data Nodes, and GoGrid is one of the first providers to offer this configuration in a Cloud Server. Both SSD and Raw Disk Cloud Servers use a redundant 10-Gbps public and private network to ensure you have the maximum bandwidth to transfer your data. Plus, the cloud makes it easy to add more Data Nodes to your cluster as needed. You can use GoGrid’s 1-Button Deploy™ to provision either a 5-server development cluster or an 11-server production cluster with Firewall Service enabled.

Development Environments

The smallest recommended size for a development cluster is 5 servers. Although it’s possible to run HBase on a single server, you won’t be able to test failover or how data is replicated across nodes. You’ll most likely have a small database so you won’t need as much RAM, but will still benefit from SSD storage and a fast network. The Data Nodes use Raw Disk Cloud Servers and are configured with a replication factor of 3.
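The replication factor mentioned above is a standard Hadoop setting. On a self-managed cluster it lives in `hdfs-site.xml`, roughly as shown below; this fragment is for illustration, since the 1-Button Deploy™ applies the equivalent configuration for you.

```xml
<!-- hdfs-site.xml: keep three copies of each block across Data Nodes -->
<property>
  <name>dfs.replication</name>
  <value>3</value>
</property>
```

With three replicas spread across Data Nodes, the cluster can lose a node without losing data, which is exactly the failover behavior a single-server development install can’t exercise.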
