Archive for the ‘Big Data’ Category

 

Using Big Data to Identify and Prevent Crimes

Thursday, March 27th, 2014

Predictive analytics tools have helped major corporations gain consumer insights, using them to drive profit growth and marketing campaigns. On the other end of the spectrum, law enforcement agencies on the national and municipal levels are using Big Data to identify and predict criminal behavior. Surveillance capabilities aside, the new techniques may discourage so-called “bad behavior” throughout the United States.

A Neighborhood Watch sign in a community.

An example of success 
The Wisconsin State Journal reported that police in Madison, Wisc., consulted with analysts in surrounding areas to anticipate a December crime wave in the University of Wisconsin’s College Court area. Apparently, once students leave for winter break in December, law enforcement officials receive numerous burglary reports from the neighborhood.

The news source noted that three crime analysts are employed by the Madison Police Department. Operating through a cloud server, the professionals are able to help officers prioritize their efforts. The unit has been with the organization for nearly 10 years, garnering headline-worthy attention when one analyst helped a detective identify patterns in a string of bank robberies that occurred earlier this year.

Caleb Kelbig, one of the data experts working with the authorities, told police in Madison and surrounding cities that the perpetrator could hit 1 of 11 possible targets on the afternoon of March 5 or 6. Amazingly, the robber appeared at one of the locations in Middleton, Wisc., at about 2:30 pm on March 5.

Prioritizing intentions, citing appropriate uses
Jignesh Patel, an expert in Big Data use and a professor at UW-Madison, noted that cloud computing has made predictive analytics tools easier to use. Developments in IT have also opened up new avenues through which digital information can be collected. For example, smartphone software has contributed significantly to the data-gathering trend.


What do P-Diddy & NoSQL have in common?

Thursday, March 20th, 2014

Ad networks are hungry for the real-time data needed to support bidding and ad serving, but the solution to their challenges isn’t coming from Oracle. It’s coming from the “bad boy” of the database world: NoSQL. NoSQL offers the low latency, scalability, and multi-data-center replication perfect for feeding the Big Data appetite of digital advertising. With so many potential use cases, GoGrid is gathering a panel of NoSQL leaders to discuss the future of their technologies and how they envision NoSQL becoming mainstream. Inspired by the original “bad boy,” Sean “P-Diddy” Combs, this meetup coincides with the first night of ad:tech San Francisco 2014.

For IT professionals, there isn’t a better opportunity to learn about Cassandra, Couchbase, Riak, MongoDB, and MemSQL than hearing from the exciting minds responsible for the development of the technology itself. Come by 111 Minna in San Francisco on Wednesday, March 26, to engage in a panel discussion with the leaders of Basho, Couchbase, DataStax, MemSQL, and GoGrid on how you can leverage their solutions to create real value and solve the complex use cases in your business. And be sure to grab a drink at the open bar and some great food while you’re at it! Attendance is free, but registration is required. Here are the details:

March 26, 2014
5:30 – 7:30 pm
111 Minna Gallery
111 Minna Street
San Francisco, CA 94105
1-415-974-1719

Space is limited, so register today.

The Big Data Storage Opportunity in the Cloud

Friday, February 21st, 2014

The Big Data phenomenon has encouraged organizations to pursue all options when accumulating increasingly diverse information sets from highly disparate sources. The trend has essentially expanded the network and caused an influx of traffic. Unfortunately, conventional IT systems with minimal or limited bandwidth simply can’t keep up with the constantly changing levels of data transit. This complication is causing some organizations to stop in their tracks, ending Big Data initiatives before they can provide any proof of positive returns.

The Big Data storage opportunity in the cloud

The good news is that the volume of Big Data doesn’t have to be a deterrent. Instead, experiencing problems with increasingly large amounts of information can be a wake-up call for businesses to implement new technologies, like flexible storage and warehousing environments that are capable of scaling on demand.

Enter: cloud computing.

Although the cloud has received a lot of attention in the application development, backup, and disaster recovery markets, its highly agile nature makes it an especially beneficial solution in the Big Data realm. By implementing a cloud storage architecture, for example, organizations can gather massive amounts of information without worrying about hitting capacity. And because the cloud is so scalable, decision-makers pay only for what they need when they need it, making the hosted environment ideal for the constantly changing demands of Big Data.

So what’s the catch?
There’s no doubt that cloud infrastructure services can be an appealing technology for companies looking to take advantage of the Big Data movement without encountering bandwidth or performance issues. However, that doesn’t mean the cloud is perfect. Some firms may encounter issues when using the cloud for the first time because the hosted services themselves are relatively new. The initial migration to the cloud, for example, can be difficult for enterprises that aren’t used to outsourcing or have never used managed services of any kind.


Big Data Can Knock Down Technical Barriers in the Boardroom

Wednesday, February 5th, 2014

The boardroom plays an important role in the ongoing development of an organization. Filled with executives and partners of all types, this room is the place where most decisions about a company’s future are made. Traditionally, discussions about the trajectory of enterprise programs were built around old conversations and past experiences, which guided decision-makers to either fund or kill prospective projects, depending on how rewarding managers believed certain initiatives would be in the long run.

Big Data can knock down technical barriers in the boardroom

Today, some companies are still clinging to these old mentalities, which isn’t helping them, especially as new technological and operational trends emerge within the enterprise. Other firms are taking a more innovative approach and embracing the tectonic shifts happening within the IT landscape. In the past, IT movements were generally ignored by the boardroom because these projects were considered too technical, which meant that most employees were unaware of the direction of IT. The Big Data phenomenon has changed all of that.

Why bring data into the boardroom?
Businesses of all sizes and industries are quickly realizing that information is the key to success. Although this ideal has been reinforced in the past, the repercussions of not using data as a guide are becoming more widely understood.

Boardroom-level decision-makers used to base their expectations on gut feelings. Doing so was sometimes beneficial, especially when executives were highly experienced and aware of what was happening within their companies and competing firms; however, there was no effective way to verify those beliefs until it was too late to change course. Instead of following this antiquated approach to decision-making, executives can now use information to their advantage.

The digital world of today is fueled by a massive increase in available information because almost every activity carried out on a smartphone, tablet, or other computing device leaves a trail of data crumbs. By gathering these scraps of information and analyzing their trajectory, organizations can essentially predict the future and build strategies to maximize return on virtually any investment.


How to Easily Deploy MongoDB in the Cloud

Monday, February 3rd, 2014

GoGrid has just released its 1-Button Deploy™ of MongoDB, available to all customers in the US-West-1 data center. This technology makes it easy to deploy either a development or production MongoDB replica set on GoGrid’s high-performance infrastructure. GoGrid’s 1-Button Deploy™ technology combines the capabilities of one of the leading NoSQL databases with our expertise in building high-performance Cloud Servers.

MongoDB is a scalable, high-performance, open source, structured storage system. MongoDB provides JSON-style document-oriented storage with full index support, sharding, sophisticated replication, and compatibility with the MapReduce paradigm. MongoDB focuses on flexibility, power, speed, and ease of use. GoGrid’s 1-Button Deploy™ of MongoDB takes advantage of our SSD Cloud Servers while making it easy to deploy a fully configured replica set.
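To make the document model above concrete, here is a minimal sketch of what a JSON-style MongoDB document looks like, using plain Python dictionaries rather than a live database (the field names and values are hypothetical, purely for illustration). Nested objects and arrays live inside a single record, with no fixed schema, and fields like the nested `customer.city` below are exactly the kind of thing a secondary index would cover.

```python
import json

# A JSON-style document as MongoDB would store it: nested fields and
# arrays live in one flexible record, no fixed schema required.
# (Hypothetical example data; MongoDB stores this as BSON internally.)
order = {
    "_id": "order-1001",  # every MongoDB document has a unique _id
    "customer": {"name": "Ada", "city": "Madison"},
    "items": [
        {"sku": "ssd-240", "qty": 2},
        {"sku": "ram-16g", "qty": 1},
    ],
    "total": 389.97,
}

# JSON rendering of the same structure:
print(json.dumps(order, indent=2))
```

In a real deployment, a driver would insert this document into a collection, and a secondary index on a nested field such as `customer.city` would let queries on that field avoid a full collection scan.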

Why GoGrid Cloud Servers?

SSD Cloud Servers have several high-performance characteristics. They all come with attached SSD storage and large available RAM for the high-I/O workloads common to MongoDB. MongoDB will attempt to place its working set in memory, so the ability to deploy servers with large available RAM is important. Plus, whenever MongoDB has to write to disk, SSDs provide for a more graceful transition from memory to disk. SSD Cloud Servers use a redundant 10-Gbps public and private network to ensure you have the maximum bandwidth to transfer your data. You can use GoGrid’s 1-Button Deploy™ to provision either a 3-server development replica set or a 5-server production replica set with Firewall Service enabled.

Development Environments

The smallest recommended size for a development replica set is 3 servers. Although it’s possible to run MongoDB on a single server, you won’t be able to test failover or how a replica set behaves in production. You’ll most likely have a small working set so you won’t need as much RAM, but will still benefit from SSD storage and a fast network.
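Once a 3-server replica set is running, an application connects to it by listing every member in the connection string along with the replica set name, which lets the driver discover the primary and follow failover automatically. The sketch below builds such a connection URI with the Python standard library only; the hostnames and the replica set name `rs0` are hypothetical placeholders, not values produced by the 1-Button Deploy™ process — substitute the addresses of your own servers.

```python
# Hypothetical addresses for a 3-server development replica set.
members = ["10.1.0.11:27017", "10.1.0.12:27017", "10.1.0.13:27017"]
replica_set = "rs0"  # assumed replica set name for this sketch

# A driver given every member plus the replicaSet option can discover
# the current primary and reconnect after a failover election.
uri = "mongodb://{hosts}/?replicaSet={rs}".format(
    hosts=",".join(members), rs=replica_set
)
print(uri)
```

A driver such as PyMongo would accept a URI of this form directly; pointing it at all three members is what makes failover testing in a development replica set meaningful, since the client survives the loss of any single server.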
