
Archive for the ‘GoGrid’ Category

 

Infographic: Big Data or Big Confusion? The Key is Open Data Services

Tuesday, July 22nd, 2014

When folks refer to “Big Data” these days, what is everyone really talking about? For several years now, Big Data has been THE buzzword used in conjunction with just about every technology issue imaginable. The reality, however, is that Big Data isn’t an abstract concept. Whether you like it or not, you’re already inundated with Big Data. How you source it, what insights you derive from it, and how quickly you act on it will play a major role in determining the course—and success—of your company. To help you get started understanding the key Big Data trends, take a look at this infographic: “60-Second Guide to Big Data and the Cloud.”

[Infographic: 60-Second Guide to Big Data and the Cloud]

Handling the increased volume, variety, and velocity—the “3Vs”—of data (shown in the center of the infographic) requires a fundamental shift in the makeup of the platform used to capture, store, and analyze the data. A platform capable of handling and capitalizing on Big Data successfully requires a mix of relational databases for structured data, NoSQL databases for unstructured data, caching solutions, and MapReduce-style tools such as Hadoop.
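To make the MapReduce pattern mentioned above concrete, here is a minimal, illustrative sketch in Python of the two phases a Hadoop-style tool distributes across a cluster (a single-process toy, not a GoGrid or Hadoop API):

```python
from collections import defaultdict

def map_phase(documents):
    # Map: emit (word, 1) pairs from each document
    for doc in documents:
        for word in doc.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    # Reduce: sum the counts for each distinct word
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["big data big insights", "open data services"]
print(reduce_phase(map_phase(docs)))
```

In a real Hadoop deployment, the map and reduce phases run in parallel across many nodes, with the framework handling the shuffle of intermediate pairs between them.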

As the need for new technologies to handle the “3Vs” of Big Data has grown, open source solutions have become the catalysts for innovation, generating a steady launch of new, relevant products to tackle Big Data challenges. Thanks to the skyrocketing pace of innovation in specialized databases and applications, businesses can now choose from a variety of proprietary and open source solutions, depending on the database type and their specific database requirements.

Given the wide variety of new and complex solutions, however, it’s no surprise that a recent survey of IT professionals showed that more than 55% of Big Data projects fail to achieve their goals. The most significant challenge cited was the difficulty of understanding and piloting the range of technologies on the market. This challenge systematically pushes companies toward a limited set of proprietary platforms that often reduce the choice to a single technology. Seeking one cure-all technology, however, is no longer a realistic strategy. No single technology, such as a database, can solve every problem, especially when it comes to Big Data. And even if one solution could serve multiple needs, successful companies are always trialing new solutions in the quest to perpetually innovate and thereby achieve (or maintain) a competitive edge.

Open Data Services and Big Data go hand-in-hand

(more…)

Architecting for High Availability in the Cloud

Tuesday, July 22nd, 2014

An introduction to multi-cloud distributed application architecture

In this blog, we’ll explore how to architect a highly available (HA) distributed application in the cloud. For those new to the concept, high availability refers to the availability of the application cluster as well as its ability to fail over or scale as needed. The ability to fail over or scale out horizontally to meet demand is what keeps the application highly available. Examples of applications that benefit from HA architectures include database applications, file-sharing networks, social applications, health monitoring applications, and eCommerce websites. So, where do you start? The easiest way to understand the concepts is simply to walk through the 3 steps of a web application setup in the cloud.

Step 1: Setting up a distributed, fault-tolerant web application architecture

In general, the application architecture can be pretty simple: perhaps just a load-balanced web front end running on multiple servers and maybe a NoSQL database like Cassandra. When you’re developing, you can get away with a single server, but once you move into production you’ll want to snapshot your web front end and spread the application across multiple servers. This approach lets you balance traffic and scale out the web front end as needed. In GoGrid, you can do this for free using our Dynamic Load Balancers. Point and click to provision the servers as needed, and then point the load balancer(s) to those servers. The process is simple, so setting up a load-balanced web front end should only take a few minutes. Any data captured or used by the servers will of course be stored in the Cassandra cluster, which is already designed to be HA.
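The load balancer’s job in this step is simply to rotate incoming requests across the web front-end servers. As an illustration only (GoGrid’s Dynamic Load Balancers do this for you), here is a minimal round-robin sketch in Python; the private-VLAN addresses are hypothetical:

```python
import itertools

class RoundRobinBalancer:
    """Toy round-robin load balancer: cycles through backend servers
    so each new request goes to the next server in rotation."""
    def __init__(self, backends):
        self._pool = itertools.cycle(backends)

    def next_backend(self):
        # Return the next web front-end server in rotation
        return next(self._pool)

# Hypothetical private addresses for three load-balanced web servers
lb = RoundRobinBalancer(["10.0.0.11", "10.0.0.12", "10.0.0.13"])
for _ in range(4):
    print(lb.next_backend())  # wraps back to 10.0.0.11 on the 4th call
```

Production load balancers add health checks, session affinity, and connection draining on top of this basic rotation, but the core idea is the same.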


Step 2: Deploying the Cassandra cluster

In GoGrid, you can use our 1-Button Deploy™ technology to set up the Cassandra cluster in about 10 minutes. This will provision the cluster for your database. Cassandra is built to be HA, so if one server fails, the load is distributed across the cluster and your application isn’t impacted. Below is a sample Cassandra cluster. A minimal deployment has 3 nodes to ensure HA, and the cluster is connected via the private VLAN. It’s a good idea to firewall the database servers and eliminate connectivity to the public VLAN. With our production 1-Button Deploy™ solution, the cluster is configured to include a firewall on demand (for free). In another blog post I’ll discuss how to secure the entire environment: setting up firewalls around your database and your web application as well as working with IDS and IPS monitoring tools and DDoS mitigation services. For the moment, however, your database and web application clusters would look something like this:
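Cassandra stays available when a node fails because each key is replicated to multiple nodes placed clockwise around a consistent-hash ring. The following simplified Python sketch (not Cassandra’s actual implementation; node names are hypothetical) shows how a 3-node cluster with replication factor 3 assigns replicas:

```python
import hashlib
from bisect import bisect_right

NODES = ["cass-1", "cass-2", "cass-3"]  # minimal 3-node HA cluster
REPLICATION_FACTOR = 3

def token(value):
    # Hash a key or node name onto a numeric position on the ring
    return int(hashlib.md5(value.encode()).hexdigest(), 16)

# Each node owns the ring segment ending at its token
RING = sorted((token(n), n) for n in NODES)

def replicas_for(key, rf=REPLICATION_FACTOR):
    # Walk clockwise from the key's position, taking rf distinct nodes
    tokens = [t for t, _ in RING]
    start = bisect_right(tokens, token(key)) % len(RING)
    return [RING[(start + i) % len(RING)][1] for i in range(rf)]

print(replicas_for("user:42"))  # rf=3 on 3 nodes: every key is on all 3
```

With replication factor 3 on a 3-node cluster, every node holds a full copy of the data, which is why losing one server doesn’t impact the application.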


(more…)

High RAM Cloud Servers for Distributed Caching

Tuesday, June 10th, 2014

GoGrid has just released High RAM Cloud Servers on our high-performance fabric. These servers are designed to provide the large amounts of RAM most commonly required for caching servers. Like our other recent product releases, they’re built on our redundant 10-Gbps public and private network.

High RAM Cloud Servers are available in the following configurations:

High RAM    RAM     Cores   SSD Storage
X-Large     16 GB   4       40 GB
2X-Large    32 GB   8       40 GB
4X-Large    64 GB   16      40 GB
8X-Large    128 GB  28      40 GB
16X-Large   256 GB  40      40 GB
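When sizing a caching server from the table above, a useful back-of-the-envelope figure is how many cached objects each configuration can hold. The sketch below is a rough estimate only; the 25% overhead reserved for the OS and cache metadata is an illustrative assumption, not a GoGrid figure:

```python
def cacheable_objects(ram_gb, avg_object_kb, overhead=0.25):
    """Rough count of objects that fit in a caching server's RAM,
    reserving a fraction of memory for OS and cache metadata overhead."""
    usable_bytes = ram_gb * 1024**3 * (1 - overhead)
    return int(usable_bytes // (avg_object_kb * 1024))

# Estimate capacity for 4-KB objects across the High RAM lineup
for size_gb in (16, 32, 64, 128, 256):
    count = cacheable_objects(size_gb, avg_object_kb=4)
    print(f"{size_gb:>3} GB RAM: ~{count:,} 4-KB objects")
```

Doubling RAM doubles the estimated object count, which is why caching workloads tend to scale up the memory axis rather than cores or storage.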


(more…)

An “Expert Interview” with Kole Hicks, Sr. Director of Products at GoGrid

Thursday, May 8th, 2014

If you’re interested in learning the answers to many common Big Data implementation questions, Syncsort, a leading provider of Big Data integration solutions, recently posted an interesting blog interview with our very own Kole Hicks, Sr. Director of Products. In the interview, blogger Mark Underwood poses several key questions to consider when beginning a Big Data project, starting with “What are the biggest obstacles?” and going all the way to “What are the in-house requirements for Big Data?”

Syncsort-blog

Check out the complete interview on the Syncsort blog.

And of course if you’re interested in a Big Data solutions integrator, the combination of Syncsort and GoGrid infrastructure might just be an ideal way to get you up and running with the push of a button!

You can learn more about Syncsort on its website.

FBI: Health Care Providers Need to Improve Security

Tuesday, May 6th, 2014

There’s no disputing that by implementing cloud servers, physicians, nurses, and hospital administrators will be able to store and access patient information more easily than before. Although such an approach enables them to develop treatments for specific patients, IT professionals and government officials believe care facilities need to improve their security before moving to the cloud.

Nurses and doctors accessing patient information.

A number of cloud solutions offer expanded data protection; however, the current state of many electronic health records systems is lackluster at best. Data flowing between hospital PCs and mobile devices opens new avenues that hackers could potentially exploit to steal sensitive personal health information.

An official security warning 
According to Reuters, the Federal Bureau of Investigation recently informed health care providers their cyber-security infrastructures were unsatisfactory compared to other industries. Although cyber criminals have been known to attack the retail and financial sectors, they could also use electronic records containing insurance and payment information to gain access to bank accounts, personal addresses, phone numbers, and other data.

Reuters obtained a private notice sent to hospital administrators criticizing their lax network defense programs. Issued earlier this month, the memo did not mention the Healthcare.gov breach, which has been criticized by professionals for numerous security flaws. It further implored recipients to contact the FBI in the event any breaches occurred.

The source stated that criminals typically favor health care information because it takes longer for victims to realize their information has been stolen. Hackers often don’t leverage the information themselves; instead, they sell it on the black market. To deter infiltration attempts, some hospitals have invested in cloud infrastructure featuring applications that encrypt data as it flows through their networks.

(more…)