Archive for the ‘Cloud Computing’ Category

 

How to Easily Deploy MongoDB in the Cloud

Monday, February 3rd, 2014

GoGrid has just released its 1-Button Deploy™ of MongoDB, available to all customers in the US-West-1 data center. This technology makes it easy to deploy either a development or production MongoDB replica set on GoGrid’s high-performance infrastructure. GoGrid’s 1-Button Deploy™ technology combines the capabilities of one of the leading NoSQL databases with our expertise in building high-performance Cloud Servers.

MongoDB is a scalable, high-performance, open source, structured storage system. MongoDB provides JSON-style document-oriented storage with full index support, sharding, sophisticated replication, and compatibility with the MapReduce paradigm. MongoDB focuses on flexibility, power, speed, and ease of use. GoGrid’s 1-Button Deploy™ of MongoDB takes advantage of our SSD Cloud Servers while making it easy to deploy a fully configured replica set.

Why GoGrid Cloud Servers?

SSD Cloud Servers have several high-performance characteristics. They all come with attached SSD storage and large available RAM for the high I/O uses common to MongoDB. MongoDB will attempt to place its working set in memory, so the ability to deploy servers with large available RAM is important. Plus, whenever MongoDB has to write to disk, SSDs provide for a more graceful transition from memory to disk. SSD Cloud Servers use a redundant 10-Gbps public and private network to ensure you have the maximum bandwidth to transfer your data. You can use GoGrid’s 1-Button Deploy™ to provision either a 3-server development replica set or a 5-server production replica set with Firewall Service enabled.
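As a back-of-the-envelope check on the RAM-sizing point above, here is a minimal sketch. The 80% headroom figure is an illustrative assumption, not a GoGrid or MongoDB recommendation:

```python
def working_set_fits(working_set_gb, server_ram_gb, headroom=0.8):
    """Return True if the estimated MongoDB working set (indexes plus
    frequently accessed documents) fits in RAM, leaving headroom for
    the OS and connection overhead. The 0.8 headroom is illustrative."""
    return working_set_gb <= server_ram_gb * headroom

# A 12 GB working set fits comfortably on a 16 GB server (12 <= 12.8):
print(working_set_fits(12, 16))
```

If the working set outgrows RAM, MongoDB pages to disk, which is where the SSD storage described above softens the performance cliff.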

Development Environments

The smallest recommended size for a development replica set is 3 servers. Although it’s possible to run MongoDB on a single server, you won’t be able to test failover or how a replica set behaves in production. You’ll most likely have a small working set so you won’t need as much RAM, but will still benefit from SSD storage and a fast network.
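To make the 3-server replica set concrete, here is a sketch of the connection string a driver would use to reach it. The hostnames and replica set name are placeholders, not actual GoGrid addresses:

```python
def replica_set_uri(hosts, replica_set, db="test"):
    """Build a MongoDB connection string for a replica set.
    Hostnames passed in are expected as 'host:port' strings."""
    return "mongodb://{}/{}?replicaSet={}".format(
        ",".join(hosts), db, replica_set)

uri = replica_set_uri(
    ["mongo1.example.com:27017",
     "mongo2.example.com:27017",
     "mongo3.example.com:27017"],
    replica_set="rs0",
)
print(uri)
```

A driver such as pymongo takes this URI directly (`MongoClient(uri)`), discovers the current primary, and fails over automatically when a member goes down, which is exactly the behavior a single-server setup cannot exercise.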


Big Data Cloud Servers for Hadoop

Monday, January 13th, 2014

GoGrid just launched Raw Disk Cloud Servers, the perfect choice for your Hadoop data node. These purpose-built Cloud Servers run on a redundant 10-Gbps network fabric on the latest Intel Ivy Bridge processors. What sets these servers apart, however, is the massive amount of raw storage in a JBOD (Just a Bunch of Disks) configuration. You can deploy up to 45 x 4 TB SAS disks on 1 Cloud Server.

These servers are designed to serve as Hadoop data nodes, which are typically deployed in a JBOD configuration. This setup maximizes available storage space on the server and also aids in performance. There are roughly 2 cores allocated per spindle, giving these servers additional MapReduce processing power. In addition, these disks aren’t a virtual allocation from a larger device. Each volume is actually a dedicated, physical 4 TB hard drive, so you get the full drive per volume with no initial write penalty.
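The raw-capacity math above works out as follows (a quick illustrative calculation):

```python
disks_per_server = 45   # maximum SAS disks on one Raw Disk Cloud Server
tb_per_disk = 4         # each volume is a dedicated physical 4 TB drive

raw_tb = disks_per_server * tb_per_disk
print(raw_tb)  # 180 TB of raw JBOD storage on a single Cloud Server
```

With roughly 2 cores per spindle, the largest configuration also brings substantial MapReduce processing power alongside that storage.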

Hadoop in the cloud

Most Hadoop distributions call for a name node supporting several data nodes. GoGrid offers a variety of SSD Cloud Servers that would be perfect for the Hadoop name node. Because they are also on the same 10-Gbps high-performance fabric as the Raw Disk Cloud Servers, SSD servers provide low latency private connectivity to your data nodes. I recommend using at least the X-Large SSD Cloud Server (16 GB RAM), although you may need a larger server, depending on the size of your Hadoop cluster. Because Hadoop stores metadata in memory, you’ll want more RAM if you have a lot of files to process. You can use any size Raw Disk Cloud Server, but you’ll want to deploy at least 3. Also, each Raw Disk Cloud Server has a different allocation of raw disks, which are illustrated in the table below. The Cloud Server in the illustration is the smallest size that has multiple disks per Cloud Server. Hadoop defaults to a replication factor of three, so to protect your data from failure, you’ll want to have at least 3 data nodes to distribute data. Although Hadoop attempts to replicate data to different racks, there’s no guarantee that your Cloud Servers will be on different racks.
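To see how the default replication factor of three affects sizing, here is a hedged sketch: usable HDFS capacity is roughly raw capacity divided by the replication factor. Overheads such as temporary MapReduce output are ignored for simplicity:

```python
def usable_hdfs_tb(raw_tb_per_node, nodes, replication=3):
    """Rough usable HDFS capacity: total raw capacity divided by the
    replication factor. Ignores scratch space for intermediate
    MapReduce output, so treat the result as an upper bound."""
    return raw_tb_per_node * nodes / replication

# 3 data nodes, each with 45 x 4 TB (180 TB raw), at replication 3:
print(usable_hdfs_tb(180, 3))  # roughly 180 TB usable
```

This is also why at least 3 data nodes matter: with fewer nodes than the replication factor, HDFS cannot place all three copies of a block on distinct machines.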

Note that the example below is for illustrative purposes only and is not representative of a typical Hadoop cluster; for example, most Cloudera and Hortonworks sizing guides start at 8 nodes. These configurations can differ greatly depending on whether you intend to use the cluster for development, production, or production with HBase added. This includes the RAM and disk sizes (less of both for development, most likely more for HBase). Plus, if you’re thinking of using these nodes for production, you should consider adding a second name node.


How To Successfully Implement a Big Data Project in 8 Steps

Monday, October 28th, 2013

There are countless ways to incorporate Big Data to improve your company’s operations. But the hard truth is that there’s no one-size-fits-all approach when it comes to Big Data. Beyond understanding your infrastructure requirements, you still need to create an implementation plan to understand what each Big Data project will mean to your organization. At a minimum, that plan should include the following 8 steps.


Step 1: Gain executive-level sponsorship

Big Data projects need to be proposed and fleshed out. They take time to scope, and without executive sponsorship and a dedicated project team, there’s a good chance they’ll fail.

Step 2: Augment rather than re-build

Start with your existing data warehouse. Your challenge is to identify and prioritize additional data sources and then determine the right hub-and-spoke technology. At this stage, you’ll want to get approval to evaluate a few options until you settle on the appropriate technology for your needs.

Report: Companies like Cloud BI over On-Premise Software

Friday, October 4th, 2013

The rapid proliferation of cloud and Big Data projects has led to the convergence of the two strategies in several ways. Currently, business intelligence (BI) processes are steadily moving to the cloud because many professionals involved in the analytic procedure believe cloud-based initiatives are easier to deploy and use and provide more substantial results.

Report: Companies like cloud BI over on-premise software

These findings were highlighted in a recent Dimensional Research study of more than 400 BI experts in which 80 percent of respondents using cloud services said they were either “very satisfied” or “satisfied” with the tools at their disposal. Meanwhile, only 51 percent of premise-based BI users expressed the same level of satisfaction.

Although there are many benefits associated with using cloud-enabled business intelligence solutions, 83 percent of respondents said the ability to implement the tools faster than their on-premise counterparts was one of the most significant advantages, Dimensional Research reported. Cloud projects can often be deployed in less than 3 months, compared to on-site initiatives, which tend to last more than 6 months.

“These results confirm that cloud BI solutions are strongly preferred, offering faster, easier, and more economical deployment and lower total cost of ownership, along with superior ease of use, which drives broader user adoption within the organization,” said Diane Hagglund, senior research analyst at Dimensional Research.

Cloud tools are easier to use
For the most part, companies are more willing to leverage cloud computing tools for analytic processes than conventional in-house services. In fact, the study found that more than half of respondents would select a cloud-based initiative over a premise-based strategy, and only 14 percent said they would choose on-premise BI software over a hosted environment. This preference is partially because organizations using the cloud experience higher adoption levels: the majority of respondents said more employees have the ability to access and use cloud BI solutions, compared to only 17 percent of BI professionals who perceived traditional offerings as more user friendly.


Big Data-Oriented Cloud Projects Improve Bottom Line

Thursday, September 5th, 2013

In the past several years, the cloud computing landscape has grown exponentially as decision-makers throughout the business world deploy the tools to experience a broad range of benefits. Although the cloud promises to eliminate unnecessary costs and optimize performance, executives may sometimes launch the technology with too broad a scope, reducing its final impact.

Big data-oriented cloud projects improve bottom line

Gartner recently highlighted how companies should address this problem by implementing the cloud to alleviate specific issues. Rather than deploying cloud services to improve operations across multiple departments, decision-makers should approach the managed services with a specific target in mind because doing so will likely open up more doors and improve bottom-line functionality.

Analysts noted that 80 percent of enterprises intend to leverage the cloud in some way within the next 12 months.

“Cloud computing is set to have a considerable impact on business in the future which is reflected in the survey finding that around 60 percent of organizations plan increased investment over the next two years to five years, while only 6 percent plan to decrease investments in cloud services,” said Gregor Petri, research director at Gartner.

Although there are many roads decision-makers can take when approaching the cloud, following the one that leads to more effective use of Big Data will have one of the most substantial returns.
