
Archive for the ‘GoGrid’ Category

 

How to Easily Deploy MongoDB in the Cloud

Monday, February 3rd, 2014

GoGrid has just released its 1-Button Deploy™ of MongoDB, available to all customers in the US-West-1 data center. This technology makes it easy to deploy either a development or production MongoDB replica set on GoGrid’s high-performance infrastructure. GoGrid’s 1-Button Deploy™ technology combines the capabilities of one of the leading NoSQL databases with our expertise in building high-performance Cloud Servers.

MongoDB is a scalable, high-performance, open source, structured storage system. MongoDB provides JSON-style document-oriented storage with full index support, sharding, sophisticated replication, and compatibility with the MapReduce paradigm. MongoDB focuses on flexibility, power, speed, and ease of use. GoGrid’s 1-Button Deploy™ of MongoDB takes advantage of our SSD Cloud Servers while making it easy to deploy a fully configured replica set.
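The JSON-style, document-oriented storage described above can be sketched with plain Python dictionaries. The collection, field names, and `find` helper below are hypothetical and run against an in-memory list rather than a live mongod, but they mirror the shape of MongoDB documents and its field-match query style:

```python
# Hypothetical in-memory sketch of MongoDB's JSON-style documents and
# field-match queries -- no server required, illustration only.
docs = [
    {"_id": 1, "user": "ada",  "region": "us-west-1", "ops": 120},
    {"_id": 2, "user": "finn", "region": "us-east-1", "ops": 45},
    {"_id": 3, "user": "mia",  "region": "us-west-1", "ops": 300},
]

def find(collection, query):
    """Return documents whose fields match every key/value pair in
    `query`, similar in spirit to db.collection.find({...})."""
    return [d for d in collection
            if all(d.get(k) == v for k, v in query.items())]

west = find(docs, {"region": "us-west-1"})
print([d["user"] for d in west])  # ['ada', 'mia']
```

In real MongoDB the same query would be expressed as `db.users.find({"region": "us-west-1"})`, with an index on `region` doing the work the list comprehension does here.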

Why GoGrid Cloud Servers?

SSD Cloud Servers have several high-performance characteristics. They all come with attached SSD storage and large available RAM for the high I/O uses common to MongoDB. MongoDB will attempt to place its working set in memory, so the ability to deploy servers with large available RAM is important. Plus, whenever MongoDB has to write to disk, SSDs provide for a more graceful transition from memory to disk. SSD Cloud Servers use a redundant 10-Gbps public and private network to ensure you have the maximum bandwidth to transfer your data. You can use GoGrid’s 1-Button Deploy™ to provision either a 3-server development replica set or a 5-server production replica set with Firewall Service enabled.

Development Environments

The smallest recommended size for a development replica set is 3 servers. Although it’s possible to run MongoDB on a single server, you won’t be able to test failover or how a replica set behaves in production. You’ll most likely have a small working set so you won’t need as much RAM, but will still benefit from SSD storage and a fast network.
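The 3-server minimum follows from MongoDB's majority-based elections: a replica set of n members can keep (or elect) a primary only while a majority, ⌊n/2⌋ + 1, of members is up. A quick back-of-the-envelope sketch:

```python
def majority(members: int) -> int:
    # Votes needed to elect and sustain a primary in a replica set.
    return members // 2 + 1

def failures_tolerated(members: int) -> int:
    # How many members can be lost while a majority survives.
    return members - majority(members)

for n in (1, 3, 5):
    print(f"{n}-member set tolerates {failures_tolerated(n)} failure(s)")
# A single server tolerates no failures at all, which is why a
# 3-member set is the smallest size that lets you test failover;
# the 5-member production set tolerates two.
```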

(more…)

How to Deploy a Riak Cluster in 5 Minutes on GoGrid

Friday, January 31st, 2014

The first big challenge to overcome with any new NoSQL database deployment is figuring out how to deploy the cluster in an environment that lets you scale as needed within a single data center and even across multiple data centers. To save cash, many customers make the mistake of trialing the product on cheap hardware with limited RAM across clusters that are inadequate for the application.

We think there’s a better way to run your evaluation. At GoGrid, we’ve made it possible to deploy a 5-node Riak cluster on beefy, high-performance machines with the click of a button. Check out the specs we’re providing as an orchestrated deployment using our 1-Button Deploy™ technology:

  • 5 nodes
  • 16 GB RAM per node
  • 16 cores per node
  • 640 GB storage per node
  • 10-Gbps network
  • 40-Gbps private network connectivity to additional Block Storage volumes (as needed)

Once the first cluster is deployed, you can point-and-click to add more nodes as you need them. Geek out for a moment on what you can do with this technology: You can run a user/session store for your application, use it to target and serve advertising, perform MapReduce operations, or any number of other things with just a few clicks of the mouse. And you can do it all in 4 easy steps.
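Pointing-and-clicking to add nodes is cheap because Riak distributes keys around a consistent-hash ring of partitions, so a new node takes over only the partitions it claims rather than forcing a full reshuffle. A simplified sketch (real Riak hashes onto a 2^160-wide SHA-1 ring with vnodes, and its claim algorithm is more sophisticated than the round-robin used here; the key names are illustrative):

```python
import hashlib

RING_SIZE = 64  # Riak's default partition count; the real ring is 2^160 wide.

def partition_for(key: str) -> int:
    """Hash a key onto one of RING_SIZE partitions."""
    h = int(hashlib.sha1(key.encode()).hexdigest(), 16)
    return h % RING_SIZE

def claim(nodes: list, partition: int) -> str:
    """Assign a partition to a node -- a round-robin simplification
    of Riak's partition-claim algorithm."""
    return nodes[partition % len(nodes)]

nodes = [f"node{i}" for i in range(1, 6)]  # the 5-node cluster above
owner = claim(nodes, partition_for("user:1001"))
```

Because key-to-partition mapping is fixed by the hash, growing the cluster only changes which node claims each partition, which is what makes adding a sixth node a one-click operation.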

Step 1: Log In to GoGrid

To get started, log in to your GoGrid account at https://my.gogrid.com to access the management console. If you don’t yet have an account, go ahead and create one: visit www.gogrid.com and click the Get Started button in the upper right-hand corner of the screen.

Step 2: Add New Infrastructure

(more…)

Infographic: Keep your patient health info secure in the cloud

Wednesday, January 22nd, 2014

Maintaining data security in the healthcare sector is hard. Although all businesses worry about securing confidential data, that concern doesn’t compare to the burden on companies managing personal health information, which must comply with the Health Insurance Portability and Accountability Act (HIPAA) and other relevant regulations. Unfortunately, the sensitive nature of these assets makes them even more desirable to cybercriminals. The result: Patient health information is being targeted more frequently and more aggressively than ever before. Fortunately, the evolving IT landscape has provided a way to address these threats: proactive security monitoring to identify and mitigate potential risks and encryption to protect the data itself.

Outside attacks are only one aspect of the problem, however: Negligent insiders are also putting their organizations at risk. Studies have shown that roughly 94% of healthcare firms have experienced at least 1 data breach within the past 2 years. Because these incidents cost the industry upwards of $7 billion per year, administrators must proactively seek strategies that cut down the chances of unwanted security problems.

Financial repercussions of a data breach

Due to the regulations governing personal health information, the reputational damage and bottom-line costs of a data breach are often exacerbated by compliance fines. What is more troubling is that breaches are increasing in both frequency and severity. Experts estimate that the financial repercussions of a data breach grew by $400,000 between 2010 and 2012, with more than half of companies losing $500,000 or more in 2012. With the price tag expected to rise 10 percent year-over-year through 2016, businesses must plan ahead to reduce these costs.
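To put the 10 percent year-over-year figure in concrete terms, here is a quick projection. The $500,000 starting point is the 2012 loss figure cited above; the years beyond 2012 are a straight compounding assumption, not reported data:

```python
# Project the cited $500,000 2012 per-company loss forward at the
# expected 10% year-over-year growth rate (assumption, not reported data).
cost = 500_000.0
growth = 0.10
for year in range(2013, 2017):
    cost *= 1 + growth
    print(year, round(cost))
# By 2016 the same breach profile costs roughly $732,000 --
# nearly half again the 2012 figure.
```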

To illustrate the effect of data breaches on healthcare organizations and the magnitude of the response required, we’ve put together the following infographic, “Keep Your Patient Health Info Secure in the Cloud.” Part of our series of 60-second guides, the graphic will show you in only a minute why the cloud is powering new ways to secure some of the most personal information available: details about our health.

[Infographic: Keep Your Patient Health Info Secure in the Cloud]

(more…)

Does it take a village to ensure security (or just hard work)?

Monday, January 6th, 2014

I watched an interview this morning where Snapchat’s CEO was discussing the recent exposure of its users’ phone numbers and names and something he said stood out for me: “Tech businesses are susceptible to hacking attacks. You have to work really, really, really hard with law enforcement, security experts, and various external and internal groups to make sure that you’re addressing security concerns.”


I have to agree with him: It takes a lot of effort to keep up with the latest security threats and vulnerabilities, to continuously assess existing security safeguards, to open channels of communications with security peers in other organizations, and to work with local and federal law enforcement to solve common security problems. Even companies that spend millions on security like Target are clearly challenged every day to identify and remove vulnerabilities to protect their customers’ data.

The rapid growth of cloud services and cloud service providers has only added new areas of concern for organizations hoping to leverage the benefits of the cloud. Organizations must perform their due diligence in identifying the right cloud service provider for their needs—preferably one that’s had time to develop security best practices based on firsthand experience and hard-won expertise. Securing a company’s production environment requires a cloud partner that is mature and has dedicated resources to provide robust security services and products.

Consider the recent DigitalOcean security revelation that its customers can view data from a VM previously used by another customer. According to one reporter, a DigitalOcean customer “noted that DigitalOcean was not by default scrubbing user’s data from its hard drives after a virtual machine instance was deleted.” Why not? DigitalOcean confided that the deletes were taking too long to complete and resulted in potential performance degradation of its services.

I recognize that challenge because GoGrid addressed this same issue years ago. All our deleted VMs go through an automated secure scrubbing process that ensures a previous customer’s data isn’t inadvertently shared with a new customer—and we do so without impacting our production environment. Was that easy to accomplish? No, it wasn’t. In fact, it took a lot of engineering work and resources to develop the right way to secure our customers’ data without impacting performance. Taking technical shortcuts when it comes to security often results in unexpected consequences that can affect an organization’s overall security—and ultimately, its reputation.
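GoGrid hasn’t published its scrubbing implementation, but the principle is simple to sketch: overwrite the storage with zeros and sync before releasing it. This hypothetical version works on an ordinary file standing in for a deleted VM’s disk; a production system would apply the same idea at the block-device level, throttled so it doesn’t degrade live workloads:

```python
import os
import tempfile

def scrub_and_delete(path: str, block_size: int = 4096) -> None:
    """Overwrite a file with zeros, flush it to disk, then unlink it."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        remaining = size
        while remaining > 0:
            chunk = min(block_size, remaining)
            f.write(b"\x00" * chunk)
            remaining -= chunk
        f.flush()
        os.fsync(f.fileno())  # ensure the zeros actually hit the disk
    os.remove(path)

# Demo on a throwaway temp file playing the role of a deleted VM's storage.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    f.write(b"customer secret data")
scrub_and_delete(path)
print(os.path.exists(path))  # False
```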

(more…)

Implementing Big Data in the Cloud: 3 Pitfalls that Could Cost You Your Job

Monday, November 25th, 2013

In IT departments around the globe, CTOs, CIOs, and CEOs are asking the same question: “How can we use Big Data technologies to improve our platform operations?” Your particular role could be responsible for solving for a wide variety of use cases ranging from real-time monitoring and alerting to platform operations analysis or behavioral targeting and marketing operations. The solutions for each of these use cases vary widely as well. But no matter which Big Data solution you choose, make sure you avoid the following 3 pitfalls.

Pitfall #1: Assuming a single solution fits all use cases

In a recent post, Liam Eagle of 451 Research looked at GoGrid’s Big Data product set, which is purpose-built for handling different types of workloads. He noted that variety is the key here. There isn’t a single one-size-fits-all solution for all your use cases. At GoGrid, for example, many of our Big Data customers are using 3 to 5 solutions, depending on their use case, and their platform infrastructure typically spans a mix of cloud and dedicated servers running on a single VLAN. So when you’re evaluating solutions, it makes sense to try out a few, run some tests, and ensure you have the right solution for your particular workload. It’s easy for an executive to tell you, “I want to use Hadoop,” but it’s your job that’s on the line if Hadoop doesn’t meet your specific needs.


As I’m sure you already know, Big Data isn’t just about Hadoop. For starters, let’s talk about NoSQL solutions. The following table lays out a few options and their associated use cases to help illustrate the point.

Solution | Common Use Cases | Pros and Cons
Cassandra | (more…)