
Archive for the ‘Big Data’ Category

 

Farmers Use Big Data to Improve Crop Yields

Wednesday, May 14th, 2014

For the past few years, scientists throughout the world have warned of an impending food shortage of global proportions. The prospect of feeding 9 billion people in the year 2050 is daunting, and it is motivating organizations to turn to advanced technology. If harnessed properly, Big Data could help agriculturalists and food companies find ways to feed a world population that’s increasing dramatically.

A farmer reaps his wheat crop.

Moving into the 21st century 
When the farming industry comes to mind, people often picture an archaic practice that lags behind in technological progress. Although every other sector seems to be adopting cloud computing, advanced software solutions, and analytics programs, agriculture appears to have been left in the dust.

Even though such a perception may be widespread, there’s no denying the sector’s importance: “No farms, no food” is the way numerous bumper stickers read. Yet, it’s important to remember that big agriculture corporations like Monsanto consistently fund and launch highly sophisticated research and development projects aimed toward improving production rates and promoting sustainability.

TechRepublic reported that Monsanto uses data analytics tools to help farmers achieve greater crop yields, employ fewer chemicals, and reduce water usage, leading to wider profit margins and more sustainable farming practices. The news source noted the company’s estimate that wider use of data analytics could increase worldwide crop production by $20 billion per year.

Starting at the ground level 
According to a study conducted in 2012 by the PrecisionAg Institute, soybean growers who used data analysis applications reported average savings of 15 percent on expenses such as seed, fertilizer, fungicide, herbicide, and other chemicals. These savings translate into more affordable food products, enabling consumers of limited means to buy more.
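To put that 15 percent figure in concrete terms, here’s a minimal back-of-the-envelope sketch; all the per-acre costs and the acreage are hypothetical, not taken from the study:

```java
public class InputSavings {
    public static void main(String[] args) {
        // Hypothetical per-acre input costs for a soybean operation ($/acre).
        double seed = 55.0, fertilizer = 40.0, chemicals = 35.0;
        double acres = 1200;
        double baseline = (seed + fertilizer + chemicals) * acres;

        // The PrecisionAg Institute figure: ~15% average savings on inputs.
        double withAnalytics = baseline * (1 - 0.15);
        System.out.printf("Baseline: $%.0f  With analytics: $%.0f  Saved: $%.0f%n",
                baseline, withAnalytics, baseline - withAnalytics);
    }
}
```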

(more…) «Farmers Use Big Data to Improve Crop Yields»

An “Expert Interview” with Kole Hicks, Sr. Director of Products at GoGrid

Thursday, May 8th, 2014

If you’re interested in learning the answers to many common Big Data implementation questions, Syncsort, a leading provider of Big Data integration solutions, recently posted an interesting blog interview with our very own Kole Hicks, Sr. Director of Products. In the interview, blogger Mark Underwood poses several key questions to consider when beginning a Big Data project, starting with “What are the biggest obstacles?” and going all the way to “What are the in-house requirements for Big Data?”


Check out the complete interview by clicking here.

And of course, if you’re interested in a Big Data solutions integrator, the combination of Syncsort and GoGrid infrastructure might just be an ideal way to get you up and running with the push of a button!

You can learn more about Syncsort on its website.

HBase Made Simple

Wednesday, April 30th, 2014

GoGrid has just released its 1-Button Deploy™ of HBase, available to all customers in the US-West-1 data center. This technology makes it easy to deploy either a development or production HBase cluster on GoGrid’s high-performance infrastructure. GoGrid’s 1-Button Deploy™ technology combines the capabilities of one of the leading NoSQL databases with our expertise in building high-performance Cloud Servers.

HBase is a scalable, high-performance, open-source database. Often called the Hadoop distributed database, it leverages the Hadoop framework but adds several capabilities such as real-time queries and the ability to organize data into a table-like structure. GoGrid’s 1-Button Deploy™ of HBase takes advantage of our SSD and Raw Disk Cloud Servers while making it easy to deploy a fully configured cluster. GoGrid deploys the latest Hortonworks distribution of HBase on Hadoop 2.0. If you’ve ever tried to deploy HBase or Hadoop yourself, you know it can be challenging. GoGrid’s 1-Button Deploy™ does all the heavy lifting and applies all the recommended configurations to ensure a smooth path to deployment.
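To make that table-like structure concrete, here’s a minimal sketch using the HBase Java client API of that era (the 0.94/0.98-style HTable interface). The table name “metrics,” the column family “d,” and the row key are hypothetical, and the table is assumed to already exist (created via the HBase shell, for example):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.util.Bytes;

public class HBaseQuickstart {
    public static void main(String[] args) throws Exception {
        // Reads hbase-site.xml from the classpath; on a deployed cluster
        // this points at the cluster's ZooKeeper quorum.
        Configuration conf = HBaseConfiguration.create();
        HTable table = new HTable(conf, "metrics");
        try {
            // Row key + column family:qualifier -> value, like a sorted map.
            Put put = new Put(Bytes.toBytes("sensor42-20140430"));
            put.add(Bytes.toBytes("d"), Bytes.toBytes("temp"), Bytes.toBytes("21.5"));
            table.put(put);

            // Real-time point read by row key.
            Result result = table.get(new Get(Bytes.toBytes("sensor42-20140430")));
            System.out.println(Bytes.toString(
                result.getValue(Bytes.toBytes("d"), Bytes.toBytes("temp"))));
        } finally {
            table.close();
        }
    }
}
```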

Why GoGrid Cloud Servers?

SSD Cloud Servers have several high-performance characteristics. They all come with attached SSD storage and large available RAM for the high I/O uses common to HBase. The Name Nodes benefit from the large RAM options available on SSD Cloud Servers and the Data Nodes use our Raw Disk Cloud Servers, which are configured as JBOD (Just a Bunch of Disks). This is the recommended disk configuration for Data Nodes, and GoGrid is one of the first providers to offer this configuration in a Cloud Server. Both SSD and Raw Disk Cloud Servers use a redundant 10-Gbps public and private network to ensure you have the maximum bandwidth to transfer your data. Plus, the cloud makes it easy to add more Data Nodes to your cluster as needed. You can use GoGrid’s 1-Button Deploy™ to provision either a 5-server development cluster or an 11-server production cluster with Firewall Service enabled.

Development Environments

The smallest recommended size for a development cluster is 5 servers. Although it’s possible to run HBase on a single server, you won’t be able to test failover or how data is replicated across nodes. You’ll most likely have a small database so you won’t need as much RAM, but will still benefit from SSD storage and a fast network. The Data Nodes use Raw Disk Cloud Servers and are configured with a replication factor of 3.
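As a rough illustration of what a replication factor of 3 means in practice, the sketch below uses the standard Hadoop FileSystem API to read the cluster’s default replication and adjust it for a single file; the path shown is hypothetical:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ReplicationCheck {
    public static void main(String[] args) throws Exception {
        // Reads core-site.xml/hdfs-site.xml from the classpath.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // dfs.replication = 3 means each HDFS block is stored on three
        // separate Data Nodes, so losing any one node loses no data.
        System.out.println("Default replication: " + fs.getDefaultReplication());

        // Replication can also be tuned per file, e.g. for scratch data.
        // (Hypothetical path; the file must already exist.)
        fs.setReplication(new Path("/tmp/scratch.dat"), (short) 2);
    }
}
```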

(more…) «HBase Made Simple»

How can businesses make the most of their data?

Thursday, April 24th, 2014

When businesses attempt to harness Big Data, they’re looking to obtain actionable intelligence that can influence key business decisions. A variety of tools are now available for doing so, but executives often get lost in the process of selecting which program best suits their requirements. If a company needs to determine how a specific action will affect a particular industry, predictive analytics is probably the right choice. If a merchandiser wants to figure out how a single customer interacts with its brand, then descriptive tools may be the best option.

Organizing a plan to satisfy a customer.

Know what you’re working with
Trying to draw conclusions from raw data aggregated onto cloud servers is both inefficient and ineffective. A company could collect all the data it wants, but if there’s no way of managing and segregating the information, hastily made conclusions could send the company in the wrong direction. In addition, professionals shouldn’t let the way they want to interpret the intelligence shape what they perceive it to say.

When it comes to understanding data, an open mind is mandatory. If tailored data reveals a slightly or entirely different angle on a particular situation, it’s better for management to adjust their plans according to the information than to distort its meaning so that it coincides with the original business strategy.

Interpreting phenomena
Ultimately, data analytics gives C-suite professionals the ability to navigate previously undecipherable patterns. ITWeb contributor Goran Dragosavac stated that there are three primary kinds of analytics platforms, each drawing considerably different conclusions from a single marketplace. Depending on what kind of business a particular company is in, the usefulness of each platform may vary significantly.

1. Predictive analytics examines the events of the past and present to determine which events will most likely transpire in the future. How can the current actions of a company influence the outcome? What should the business do to change the end result?
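As a toy illustration of the idea (real predictive analytics platforms use far richer models), the sketch below fits a simple least-squares trend line to past observations and projects one period ahead; the quarterly sales figures are hypothetical:

```java
public class TrendForecast {
    // Fits y = a + b*x by ordinary least squares, then returns the
    // model's prediction for the next period (x = n).
    static double forecastNext(double[] y) {
        int n = y.length;
        double sumX = 0, sumY = 0, sumXY = 0, sumXX = 0;
        for (int x = 0; x < n; x++) {
            sumX += x;
            sumY += y[x];
            sumXY += x * y[x];
            sumXX += (double) x * x;
        }
        double b = (n * sumXY - sumX * sumY) / (n * sumXX - sumX * sumX);
        double a = (sumY - b * sumX) / n;
        return a + b * n;
    }

    public static void main(String[] args) {
        double[] quarterlySales = {110, 118, 127, 135}; // hypothetical figures
        System.out.println("Next quarter estimate: " + forecastNext(quarterlySales));
    }
}
```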

(more…) «How can businesses make the most of their data?»

How Public Organizations Should Treat Big Data

Tuesday, April 22nd, 2014

Though the “only human” argument certainly doesn’t apply to Big Data, enterprises and public organizations often expect too much out of the technology. Some executives are frustrated by results that don’t necessarily correlate with their predetermined business plans, and others consider one-time predictive conclusions to be final. The problem is, there’s no guarantee that analytical results will be “right.”

A government-themed action key

Public authorities interested in integrating Big Data into their cloud servers need to understand two things. First, digital information possesses no political agenda, lacks emotion, and perceives the world in a completely pragmatic manner. And second, data changes as time progresses. For example, just because a county in Maine experienced a particularly rainy spring doesn’t mean the farming soil will remain moist; future weather conditions may drastically alter the environment.

Benefiting from “incorrect” data
If a data analysis program harvests information from one source over the course of an hour and then attempts to draw conclusions, the system’s deductions will be correct only to the extent that it accurately translated ones and zeroes into actionable intelligence. However, because the source from which the data was aggregated continues to produce new, variable information, later data may eventually contradict the original deduction.
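A minimal sketch of that effect: the running estimate below is always “correct” for the data seen so far, yet it keeps shifting as the same source produces new readings (the rainfall numbers are hypothetical):

```java
public class RunningEstimate {
    private long count = 0;
    private double mean = 0.0;

    // Incremental mean update: exact for the data observed so far,
    // but the conclusion moves every time new data arrives.
    void observe(double value) {
        count++;
        mean += (value - mean) / count;
    }

    double estimate() { return mean; }

    public static void main(String[] args) {
        RunningEstimate rainfall = new RunningEstimate();
        // Early readings suggest a very wet spring...
        for (double mm : new double[] {4.1, 3.8, 4.4}) rainfall.observe(mm);
        System.out.println("Early conclusion: " + rainfall.estimate());
        // ...but the source keeps producing data that contradicts it.
        for (double mm : new double[] {0.2, 0.1, 0.0, 0.3}) rainfall.observe(mm);
        System.out.println("Revised conclusion: " + rainfall.estimate());
    }
}
```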

Tim Harford, a contributor to the Financial Times, cited Google’s use of predictive analytics tools to chart how many people would be affected by influenza, applying algorithms to scrutinize more than 50 million search terms. The problem was that four years into the project, the company’s system was undercut by more recent data from the Centers for Disease Control and Prevention, which showed that Google’s estimates of the spread of flu-like illnesses were overstated by a ratio of 2:1.

Taking the good with the bad
Although Harford presented Google’s failure as evidence that Big Data isn’t everything software developers claim it to be, Forbes contributor Adam Ozimek noted that the study demonstrated one of the technology’s advantages: the ability to reject conclusions in light of continually updated information. Furthermore, it’s important to note that Google collected intelligence from only one source, whereas the CDC was amassing data from numerous sources.

(more…) «How Public Organizations Should Treat Big Data»