Archive for November, 2013

 

Implementing Big Data in the Cloud: 3 Pitfalls that Could Cost You Your Job

Monday, November 25th, 2013

In IT departments around the globe, CTOs, CIOs, and CEOs are asking the same question: “How can we use Big Data technologies to improve our platform operations?” Depending on your role, you may be responsible for solving a wide variety of use cases, ranging from real-time monitoring and alerting to platform operations analysis, behavioral targeting, and marketing operations. The solutions for these use cases vary just as widely. But no matter which Big Data solution you choose, make sure you avoid the following 3 pitfalls.

Pitfall #1: Assuming a single solution fits all use cases

In a recent post, Liam Eagle of 451 Research looked at GoGrid’s Big Data product set, which is purpose-built for handling different types of workloads. He noted that variety is the key here. There isn’t a single one-size-fits-all solution for all your use cases. At GoGrid, for example, many of our Big Data customers are using 3 to 5 solutions, depending on their use case, and their platform infrastructure typically spans a mix of cloud and dedicated servers running on a single VLAN. So when you’re evaluating solutions, it makes sense to try out a few, run some tests, and ensure you have the right solution for your particular workload. It’s easy for an executive to tell you, “I want to use Hadoop,” but it’s your job that’s on the line if Hadoop doesn’t meet your specific needs.


As I’m sure you already know, Big Data isn’t just about Hadoop. For starters, let’s talk about NoSQL solutions. The following table lays out a few options and their associated use cases to help illustrate the point.

Solution | Common Use Cases | Pros and Cons
Cassandra | …

(more…) «Implementing Big Data in the Cloud: 3 Pitfalls that Could Cost You Your Job»
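To make the point about matching data models to workloads concrete, here is a minimal sketch in plain Python (no database drivers) of how the same page-view event might be shaped for a wide-column store like Cassandra, a document store, and a key-value cache. The field names and keys are illustrative assumptions, not any particular product’s schema.

```python
# A toy illustration (plain Python, no database drivers) of how one "page view"
# event could be modeled for three different NoSQL-style stores. Field names
# and keys are illustrative assumptions, not a specific product's schema.

event = {"user_id": "u-42", "url": "/pricing", "ts": "2013-11-25T10:15:00Z"}

# Wide-column style (e.g. Cassandra): rows keyed by user, columns keyed by time,
# which suits time-series reads such as "all events for user u-42 this week."
wide_column_row = {"u-42": {event["ts"]: {"url": event["url"]}}}

# Document style (e.g. MongoDB): the whole event as one self-contained document,
# which suits ad-hoc queries over flexible, nested records.
document = dict(event, context={"campaign": "fall-promo", "device": "mobile"})

# Key-value / cache style (e.g. Redis or Memcached): a denormalized counter,
# which suits low-latency lookups like real-time dashboards.
cache = {}
cache_key = "pageviews:/pricing:2013-11-25"
cache[cache_key] = cache.get(cache_key, 0) + 1

print(wide_column_row, document, cache, sep="\n")
```

The takeaway is that each shape optimizes for a different read pattern, which is why a single store rarely covers every use case.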

How To Deploy an SSD Cloud Server in 5 Minutes on GoGrid

Wednesday, November 20th, 2013

GoGrid’s solid-state disk (SSD) Cloud Servers are the next evolution in cloud servers. With 10-Gbps public and private network connectivity, RAM allocations of up to 64 GB, 40 cores, up to 2,199 GB of persistent storage, plus the ability to provision up to 12 TB block storage volumes, these servers are designed to handle most high-I/O applications. You can find out more about our SSD Cloud Servers in a video we posted on YouTube.

In this post I’ll walk you through the basics of deploying an SSD Cloud Server on GoGrid. Like everything we deliver as a service, deploying SSD Cloud Servers through the GoGrid management console is simple and easy. Let’s get started.

I begin by logging into my.gogrid.com with my username and password. If you don’t have an account yet, you can sign up for GoGrid by clicking here.


Once in the management console, I select the +add button to begin setting up my server deployment. Selecting +add lets me deploy a variety of compute, network, and storage options.
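If you would rather script this step than click through the console, the same provisioning can also be driven programmatically. The Python sketch below is only a rough illustration: the REST endpoint, API version, signing scheme, and parameter names are assumptions for the sake of example, so check GoGrid’s current API documentation for the real call signature.

```python
# A minimal sketch of scripting a server deployment against GoGrid's REST API.
# The endpoint, version, signing scheme, and parameter names below are
# assumptions for illustration only -- consult the current API docs.
import hashlib
import time
import requests

API_KEY = "your-api-key"        # placeholder
SHARED_SECRET = "your-secret"   # placeholder

def signed_params():
    """Assumed GoGrid-style signing: md5(api_key + shared_secret + unix_time)."""
    now = str(int(time.time()))
    sig = hashlib.md5((API_KEY + SHARED_SECRET + now).encode()).hexdigest()
    return {"api_key": API_KEY, "sig": sig, "format": "json", "v": "1.9"}

def add_ssd_server(name, image, ram_gb, public_ip):
    """Request a new SSD cloud server (parameter names are illustrative)."""
    params = signed_params()
    params.update({
        "name": name,
        "image": image,           # a server image identifier
        "server.ram": f"{ram_gb}GB",
        "ip": public_ip,
    })
    resp = requests.get("https://api.gogrid.com/api/grid/server/add", params=params)
    resp.raise_for_status()
    return resp.json()

# Example (would provision a billable server if the call were real):
# print(add_ssd_server("ssd-demo-01", "CentOS 6.4 (64-bit)", 8, "203.0.113.10"))
```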


(more…) «How To Deploy an SSD Cloud Server in 5 Minutes on GoGrid»

Practical Big Data Use in Marketing Delivers Big Returns

Monday, November 18th, 2013

Although organizations around the world continue to have mixed feelings about Big Data, the truth is that properly planned, launched, and managed programs deliver significant benefits to those willing to follow through with comprehensive strategies. Doing so requires diligence on the part of both decision-makers and employees, however, as well as strong collaboration across departments and teams in service of company-wide goals.

Practical Big Data use in marketing delivers big returns

The bottom line is that Big Data initiatives pay off for organizations committed enough to see them through. This conclusion was highlighted in a recent Rocket Fuel and Forbes Insights study of more than 200 senior executives, which found that roughly 60 percent of businesses that use Big Data at least 50 percent of the time have exceeded their initial goals for those strategies. Conversely, only about a third of companies that don’t use Big Data frequently were able to meet their expectations.

The moral of this story is that planning ahead and making effective use of available data will give companies an edge in the long run. This is especially true for marketing organizations that rely on information to build customized strategies to draw in, cultivate, and retain prospective customers.

Interestingly, the study found that organizations often have conflicting ideas about how much they actually use Big Data and what effects it produces. The survey revealed that a majority of marketing agencies said they frequently or always take full advantage of data in their advertising processes, yet only about 10 percent of companies manage more than half of their promotional endeavors with Big Data.

Bringing Big Data into marketing
Information is the basis for all effective decision-making, especially in marketing. If organizations blindly adopt promotional strategies without first assessing the landscape and how those endeavors will function, they risk not only failing in their attempt to acquire new customers, but possibly losing existing ones who find such attempts unappealing.

(more…) «Practical Big Data Use in Marketing Delivers Big Returns»

Speed, Accuracy More Important than Volume for Big Data

Friday, November 15th, 2013

Although information has always had an important role in the business world, the Big Data movement has placed a larger emphasis on the aggregation and management of digital resources. Companies around the world are now seeking programs that help employees collect and store increasingly larger volumes of information because most decision-makers believe the chances of striking gold will increase if they have more data to analyze.

Speed, accuracy more important than volume for Big Data

Ironically, organizations that believe in collecting massive amounts of information often find themselves inundated by the sheer size of the resources under their control and, as a result, encounter new performance obstacles that lead to substantial issues. This conundrum is forcing executives to ask, “Is bigger always better?”

The answer is no, especially in today’s Big Data world. Rather than focusing on the quantity of information they collect, organizations should focus on its quality. Because there’s no single, definitive definition of “Big Data,” each company must develop its own strategy that aligns with the way its employees work on a daily basis. In many cases, speed will outweigh a number of other metrics associated with the information being collected. If an organization can quickly convert unstructured data into useful insight for its marketing and sales teams, for example, it will likely be able to reduce client churn and improve its ability to attract, engage, and retain customers.

Still, many experts believe that the data with the largest business impact is often the most difficult to measure. The need to measure it will pressure executives to build robust warehousing environments that can make sense of varied information sets as quickly as possible, allowing firms of all sizes to put a broad range of information to work and improving their ability to compete and prosper in the long run.

Quality over quantity
Of course it will become increasingly difficult to find a needle if the haystack continues to grow in size. This prospect should encourage executives to build programs that regularly maintain the haystack, trimming the unnecessary heaps of hay to make sifting through the pile less complex. In terms of data management, this means that organizations need to understand the complications and opportunity costs associated with the inability to keep information storage environments in check.

(more…) «Speed, Accuracy More Important than Volume for Big Data»

Big Data = Big Confusion? Hint: The Key is Open Data Services

Wednesday, November 6th, 2013

When folks refer to “Big Data” these days, what is everyone really talking about? For several years now, Big Data has been THE buzzword used in conjunction with just about every technology issue imaginable. The reality, however, is that Big Data isn’t an abstract concept. Whether you like it or not, you’re already inundated with Big Data. How you source it, what insights you derive from it, and how quickly you act on it will play a major role in determining the course—and success—of your company. Let’s talk specifics…

The 3 Vs of Big Data

Handling the increased volume, variety, and velocity (the “3 Vs”) of data requires a fundamental shift in the makeup of the platform used to capture, store, and analyze that data. A platform capable of handling and capitalizing on Big Data successfully requires a mix of relational databases for structured data, NoSQL databases for unstructured data, caching solutions, and MapReduce-style tools such as Hadoop.
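As a concrete, if toy, illustration of the MapReduce model that Hadoop-style tools apply at cluster scale, here is a minimal in-memory word count in Python. It sketches the programming model only; it is not Hadoop’s actual API or a distributed implementation.

```python
# A toy, in-memory illustration of the MapReduce programming model that
# Hadoop-style tools apply at cluster scale. This sketches the model only;
# it is not Hadoop's API or a distributed implementation.
from collections import defaultdict

def map_phase(records):
    """Map: emit (key, value) pairs -- here, (token, 1) for every token."""
    for record in records:
        for token in record.lower().split():
            yield token, 1

def shuffle(pairs):
    """Shuffle: group values by key, as the framework does between phases."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce: collapse each key's values into one result -- here, a sum."""
    return {key: sum(values) for key, values in grouped.items()}

log_lines = [
    "GET /pricing 200",
    "GET /pricing 500",
    "GET /signup 200",
]
print(reduce_phase(shuffle(map_phase(log_lines))))
# {'get': 3, '/pricing': 2, '200': 2, '500': 1, '/signup': 1}
```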

As the need for new technologies to handle the 3 Vs of Big Data has grown, open source solutions have become catalysts for innovation, generating a steady stream of new products built to tackle Big Data challenges. Thanks to the skyrocketing pace of innovation in specialized databases and applications, businesses can now choose from a variety of proprietary and open source solutions, depending on the database type and their specific requirements.

Given the wide variety of new and complex solutions, however, it’s no surprise that a recent survey of IT professionals showed that more than 55% of Big Data projects fail to achieve their goals. The most significant challenge cited was a lack of understanding of the technologies on the market and of the ability to pilot them. That challenge systematically pushes companies toward a limited set of proprietary platforms that often reduce the choice to a single technology. But seeking one cure-all technology is no longer a realistic strategy: no single technology, such as a database, can solve every problem, especially when it comes to Big Data. Even if a single solution could serve multiple needs, successful companies are always trialing new solutions in the quest to perpetually innovate and thereby achieve (or maintain) a competitive edge.

Open Data Services and Big Data go hand-in-hand

(more…) «Big Data = Big Confusion? Hint: The Key is Open Data Services»