Archive for 2013

 

Practical Big Data Use in Marketing Delivers Big Returns

Monday, November 18th, 2013

Although organizations around the world continue to have mixed feelings about Big Data, the truth is that properly planned, launched, and managed programs deliver significant benefits to those willing to follow through with comprehensive strategies. Doing so requires diligence on the part of both decision-makers and employees, however, as well as strong collaboration across departments and teams in pursuit of company-wide goals.

Practical Big Data use in marketing delivers big returns

The bottom line is that Big Data initiatives pay off for those committed enough to see them through. This conclusion was highlighted in a recent Rocket Fuel and Forbes Insights study of more than 200 senior executives, which found that roughly 60 percent of businesses that use Big Data at least 50 percent of the time have exceeded their initial goals for those strategies. Conversely, only about a third of companies that use Big Data infrequently were able to meet their expectations.

The moral of this story is that planning ahead and making effective use of available data will give companies an edge in the long run. This is especially true for marketing organizations that rely on information to build customized strategies to draw in, cultivate, and retain prospective customers.

Interestingly, the study found that organizations often have conflicting ideas about how extensively they are using Big Data, and those perceptions shape the results they see. The survey revealed that the majority of marketing agencies said they frequently or always take full advantage of data in their advertising processes, yet only about 10 percent of companies manage more than half of their promotional endeavors with Big Data.

Bringing Big Data into marketing
Information is the basis for all effective decision-making, especially in marketing. If organizations blindly adopt promotional strategies without first assessing the landscape and how those endeavors will function, they risk not only failing in their attempt to acquire new customers, but possibly losing existing ones who find such attempts unappealing.

(more…)

Speed, Accuracy More Important than Volume for Big Data

Friday, November 15th, 2013

Although information has always had an important role in the business world, the Big Data movement has placed a larger emphasis on the aggregation and management of digital resources. Companies around the world are now seeking programs that help employees collect and store increasingly larger volumes of information because most decision-makers believe the chances of striking gold will increase if they have more data to analyze.

Speed, accuracy more important than volume for Big Data

Ironically, organizations that believe in collecting massive amounts of information often find themselves inundated by the sheer size of the resources under their control and, as a result, run into new performance obstacles. This conundrum is forcing executives to ask, “Is bigger always better?”

The answer is no, especially in today’s Big Data world. Rather than focusing on the quantity of information they collect, organizations should focus on its quality. Because there’s no single, definitive definition of “Big Data,” each company must develop its own strategy, one that aligns with the way its employees work on a daily basis. In many cases, speed will outweigh many of the other metrics associated with the information being collected. If an organization can quickly convert unstructured data into useful insight for its marketing and sales teams, for example, it will likely be able to reduce client churn and improve its ability to attract, engage, and retain customers.

Still, many experts believe that the data with the largest business impact is the most difficult to measure. The need to do so will pressure executives to build robust warehousing environments that can make sense of various information sets as quickly as possible, allowing firms of all sizes to use a broad range of information for numerous practices, improving their ability to compete and prosper in the long run.

Quality over quantity
Of course, it becomes increasingly difficult to find a needle as the haystack grows. This prospect should encourage executives to build programs that regularly maintain the haystack, trimming unnecessary hay to make sifting through the pile less complex. In terms of data management, this means organizations need to understand the complications and opportunity costs that come with failing to keep information storage environments in check.
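
The “haystack maintenance” idea amounts to a data-retention policy. A minimal sketch in Python (the record layout and the 365-day window are hypothetical, purely for illustration):

```python
from datetime import datetime, timedelta

def trim_haystack(records, max_age_days=365):
    # Keep only records newer than the retention window; everything
    # older is a candidate for archiving or deletion.
    cutoff = datetime.now() - timedelta(days=max_age_days)
    return [r for r in records if r["collected_at"] >= cutoff]

records = [
    {"id": 1, "collected_at": datetime.now() - timedelta(days=30)},
    {"id": 2, "collected_at": datetime.now() - timedelta(days=800)},
]
print([r["id"] for r in trim_haystack(records)])  # → [1]
```

In practice the cutoff would vary by data type and regulatory requirements, but the principle is the same: a smaller, fresher haystack is cheaper to store and faster to search.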

(more…)

Big Data = Big Confusion? Hint: The Key is Open Data Services

Wednesday, November 6th, 2013

When folks refer to “Big Data” these days, what is everyone really talking about? For several years now, Big Data has been THE buzzword used in conjunction with just about every technology issue imaginable. The reality, however, is that Big Data isn’t an abstract concept. Whether you like it or not, you’re already inundated with Big Data. How you source it, what insights you derive from it, and how quickly you act on it will play a major role in determining the course—and success—of your company. Let’s talk specifics…


Handling the increased volume, variety, and velocity (the “3Vs”) of data requires a fundamental shift in the makeup of the platform used to capture, store, and analyze it. A platform capable of handling and capitalizing on Big Data successfully requires a mix of relational databases for structured data, NoSQL databases for unstructured data, caching solutions, and MapReduce-style tools such as Hadoop.
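
The MapReduce pattern behind those Hadoop-style tools can be sketched in a few lines of Python (a toy word count over in-memory records, not any particular product’s API):

```python
from collections import defaultdict

def map_phase(records):
    # Map step: emit a (word, 1) pair for every word in every record.
    for record in records:
        for word in record.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    # Reduce step: group pairs by key and sum the counts.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

records = ["big data big returns", "data drives decisions"]
print(reduce_phase(map_phase(records)))
# → {'big': 2, 'data': 2, 'returns': 1, 'drives': 1, 'decisions': 1}
```

Real MapReduce frameworks add the parts that matter at scale, distributing the map and reduce steps across many machines and shuffling intermediate pairs between them, but the programming model is this simple.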

As the need for new technologies to handle the “3Vs” of Big Data has grown, open source solutions have become the catalysts for innovation, generating a steady launch of new, relevant products to tackle Big Data challenges. Thanks to the skyrocketing pace of innovation in specialized databases and applications, businesses can now choose from a variety of proprietary and open source solutions, depending on the database type and their specific database requirements.

Given the wide variety of new and complex solutions, however, it’s no surprise that a recent survey of IT professionals showed that more than 55% of Big Data projects fail to achieve their goals. The most significant challenge cited was a lack of understanding of, and ability to pilot, the range of technologies on the market. This challenge systematically pushes companies toward a limited set of proprietary platforms that often narrow the choice to a single technology. Seeking one cure-all technology is no longer a realistic strategy: no single technology, database or otherwise, can solve every problem, especially when it comes to Big Data. Even if such a solution could serve multiple needs, successful companies are always trialing new solutions in the quest to innovate perpetually and thereby achieve (or maintain) a competitive edge.

Open Data Services and Big Data go hand-in-hand

(more…)

Big Data Speed May Be More Important Than Size

Thursday, October 31st, 2013

In today’s business world, Big Data is the hottest trend under discussion in most board rooms as decision-makers around the world try to understand how they can accurately gather, manage, analyze, and use new types of information. During the past several years, executives have concentrated on the three Vs of Big Data: volume, variety, and velocity. But there are still a few risks associated with improperly deployed Big Data initiatives, regardless of how much experience an organization has with managing large volumes of digital information.

Big Data speed may be more important than size

Information Management recently highlighted some concerns that often come along with the “more is better” mantra. The truth is that companies are collecting more raw data than they can handle, sometimes introducing unexpected security, performance, and management complexities in the process. Because the volume of data under corporate control is growing so quickly, for example, organizations may need to implement advanced storage environments even if decision-makers aren’t entirely sure what information they’re collecting. In many cases, those executives are finding solace in cloud computing architectures that provide a scalable, flexible landscape with ample storage options that can keep up with the recent information explosion.

Although the issue of where and how to maintain information assets can be simplified through the use of the cloud, there’s a more important issue that should be addressed when it comes to Big Data: how to transform raw information into a business asset.

Maintaining data value
The unprecedented growth of data is not only putting pressure on storage and security resources, it’s also exposing a phenomenon tied to information’s rate of decay. As with supply and demand, the more of a resource that is supplied, the less value it inherently possesses. For businesses aggregating massive volumes of data, this tenet means that decision-makers need to use those assets quickly, before they lose their significance.

Information Management noted that the half-life of data is shrinking rapidly as more resources are generated and collected. This situation has encouraged organizations to embrace analytic technologies that work in real time, or close to it, so decision-makers can convert unstructured data into useful insights as fast as possible. That capability will give many firms a competitive advantage, allowing them to improve the customer experience and even predict what customers will want or do before they act.
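
The half-life metaphor can be made concrete with a simple exponential-decay model (the starting value and the seven-day half-life below are hypothetical, purely illustrative):

```python
def data_value(initial_value, half_life_days, age_days):
    # Exponential decay: the value halves every `half_life_days`.
    return initial_value * 0.5 ** (age_days / half_life_days)

# Hypothetical: an insight worth 100 units when fresh, with a
# half-life of 7 days, retains only a quarter of its value
# after two weeks.
print(data_value(100, 7, 14))  # → 25.0
```

The shorter the half-life, the stronger the case for real-time or near-real-time analytics: waiting a single reporting cycle can cost most of an insight’s value.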

(more…)

How To Successfully Implement a Big Data Project in 8 Steps

Monday, October 28th, 2013

There are countless ways to incorporate Big Data to improve your company’s operations. But the hard truth is that there’s no one-size-fits-all approach when it comes to Big Data. Beyond understanding your infrastructure requirements, you still need to create an implementation plan to understand what each Big Data project will mean to your organization. At a minimum, that plan should include the following 8 steps.


Step 1: Gain executive-level sponsorship

Big Data projects need to be proposed and fleshed out. They take time to scope, and without executive sponsorship and a dedicated project team, there’s a good chance they’ll fail.

Step 2: Augment rather than re-build

Start with your existing data warehouse. Your challenge is to identify and prioritize additional data sources and then determine the right hub-and-spoke technology. At this stage, you’ll want to get approval to evaluate a few options until you settle on the appropriate technology for your needs. (more…)