
Unpacking “the Internet of Things”

July 29th, 2014

If you’ve paged through a business or technology magazine in the past several years, you’ve almost certainly come across the term “Internet of Things” while looking for news on Big Data. But what does it actually mean? Unpacking the term is a hefty but necessary task if the concept is to enter the zeitgeist, which many believe will happen in the near future.

Deconstructing an oft-confused term and exposing its true meaning.

What is it?
According to Techopedia, the Internet of Things, or IoT, is “a computing concept that describes a future where everyday physical objects will be connected to the Internet and be able to identify themselves to other devices.”

Still sound like a science fiction movie? The truth is, devices are already being programmed and designed with this eventual goal in mind. As Wi-Fi spreads and becomes more convenient, and as working across multiple platforms becomes the norm, society will gradually inch closer to realizing the Internet of Things.

“The term is closely identified with RFID [radio-frequency identification] as the method of communication, although it may include other sensor technologies, wireless technologies, or QR codes,” the definition continued.
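To make that definition concrete, here’s a minimal sketch of a device identifying itself to another service over HTTP. The registry URL, payload fields, and device ID are invented for illustration; a real deployment might use RFID plus a lightweight protocol like MQTT instead:

```python
import json
import time
import urllib.request

# Hypothetical registry endpoint -- a stand-in for whatever service
# the device reports to, not a real API.
REGISTRY_URL = "http://example.com/devices/announce"

def announce(device_id: str, device_type: str) -> None:
    """Send a small JSON payload identifying this device."""
    payload = json.dumps({
        "id": device_id,           # unique ID, e.g. derived from an RFID tag
        "type": device_type,       # what kind of "thing" this is
        "timestamp": time.time(),  # when the announcement was made
    }).encode("utf-8")
    request = urllib.request.Request(
        REGISTRY_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        print("registry replied with status", response.status)

if __name__ == "__main__":
    announce("tag-0042", "smart-thermostat")
```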

Consider how cloud hosting already impacts our everyday lives: we share information on mobile devices, and we interact with Big Data through the way we’re marketed to, the way we receive information each day, and the way we consume media. Technology is becoming ubiquitous: many public spaces are now equipped with Wi-Fi, and even our televisions and houses are “smart” enough to interact with us via security systems, for example. In many ways, we’re already well on our way to a future that sounds straight out of a Hollywood script.

When will it become a reality?

Not surprisingly, there’s no exact answer for when the Internet of Things will be considered complete and normalized in our global culture. First, the technology needs to mature to a stage where there’s less room for error; second, industrialization needs to expand the access remote and undeveloped areas have to Internet-friendly devices.


What does “any cloud” orchestration mean for telcos?

July 28th, 2014

Last week, GoGrid announced its “any cloud” orchestration engine service, enabling telcos to deliver complex, on-demand solutions, multi-cloud support, and data sovereignty compliance. We received an overwhelmingly positive response to our pioneering technology and unique approach.

[Image: orchestration engine service]

But don’t just take our word for it. Check out what Light Reading (the premier publication for the telecom industry) had to say in its story, “GoGrid Puts Cloud at a Touch of a Button.” Written by Carol Wilson, the article underscores why orchestration is so attractive to telcos:

“The attraction is taking the complexity out of provisioning applications, and while GoGrid is initially focused on big data apps, this approach can support other kinds of applications as well, says Caroline Chappell, senior analyst with Heavy Reading. Telecom cloud service providers have been trying to develop this capability themselves so they can offer business customers the ability to stand up a whole range of applications in the cloud without individually engineering each one through a complex process, she says.

“Telecom cloud providers can make the provisioning process part of the service they offer, even if the application itself is run by an app partner in a third-party cloud, Chappell notes. That creates the ability for telecom cloud providers to offer a cloud-based “business in a box” type offer that doesn’t require complex provisioning of each individual service.”

Orchestration is key for telcos to quickly and easily increase the variety and number of products they offer. It also keeps them focused on what they do best: selling solutions wrapped with value-added services instead of commodity infrastructure. Click here to learn more about GoGrid’s orchestration and 1-Button Deploy™ solutions.
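To give a feel for what one-button orchestration looks like from the developer’s side, here’s a hedged sketch of the kind of call an orchestration engine might expose. The endpoint, token, and payload fields are hypothetical illustrations, not GoGrid’s actual API:

```python
import json
import urllib.request

# Hypothetical orchestration endpoint and token; the real API will differ.
ORCHESTRATION_URL = "https://api.example-cloud.com/v1/deployments"
API_TOKEN = "replace-with-your-token"

def deploy_stack(template: str, cloud: str, region: str) -> dict:
    """Ask the orchestration engine to stand up an entire stack
    (servers, load balancer, database) from a named template."""
    payload = json.dumps({
        "template": template,  # e.g. a pre-built Big Data blueprint
        "cloud": cloud,        # "any cloud": which provider to target
        "region": region,      # lets telcos honor data sovereignty rules
    }).encode("utf-8")
    request = urllib.request.Request(
        ORCHESTRATION_URL,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + API_TOKEN,
        },
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)

# One call replaces the per-application engineering the article
# describes telcos trying to build themselves.
print(deploy_stack("cassandra-cluster", cloud="partner-cloud", region="eu-west"))
```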

How Big Data Tells a Story

July 24th, 2014

Associations with Big Data tend to be pretty clinical: it’s often considered a tool for making more accurate scientific statements, identifying trends in social media and news, and developing products by gauging customer response. In other words, the cloud computing tool is largely viewed as a shortcut to making money and creating new offerings for the public, whether that’s a breakthrough medication, a new way to communicate wirelessly, or something the world has never even heard of. A less common but equally fascinating use of the technology, however, is as a storytelling mechanism, a capability that may be the most powerful use of all.

As in previous generations, science and storytelling need to coexist to remain powerful, a fact that rings true when considering the developing uses of Big Data.

The value of storytelling
Storytelling, and the value placed on its teller, is a tradition ingrained in human culture that has existed for thousands of years. In generations past, before the written word and the widespread publishing of books and magazines, storytellers would enthrall listeners with memorized recitations of epics like Homer’s “Odyssey” and Ovid’s “Metamorphoses.” A recent piece on the Fast CoCreate blog detailed some of the finer points of this tradition.

“Results repeatedly show that our attitudes, fears, hopes, and values are strongly influenced by story,” the source stated. “In fact, fiction seems to be more effective at changing beliefs than writing that is specifically designed to persuade through argument and evidence.”

These statements have plenty of evidence to back them up: stories sell. The movie and publishing industries bring in billions every year, and even our most prevalent social media tools, especially Facebook, are designed to tell the “story” of a user’s life online by highlighting the events and posts that have received the most attention. That’s just one example of mass data being boiled down to a basic storyline, but it’s a valuable one. Even Snapchat, the ever-present application famous for images that disappear a few seconds after they’re viewed, has introduced “Snapchat Stories,” a feature that lets users build a narrative from their brief messages.
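As a rough illustration of mass data being boiled down to a storyline, here’s a small Python sketch that reduces a feed of made-up posts to the handful of highlights a timeline feature might surface. The data and the engagement scoring are invented for illustration:

```python
from datetime import date

# Invented sample feed: each post is (date, text, likes, comments).
posts = [
    (date(2014, 1, 12), "Started a new job", 148, 42),
    (date(2014, 3, 3),  "Lunch again", 2, 0),
    (date(2014, 6, 21), "Got engaged!", 512, 130),
    (date(2014, 7, 4),  "Fireworks photo", 87, 9),
    (date(2014, 7, 19), "Ran a marathon", 233, 51),
]

def engagement(post):
    """Score a post by the attention it received; comments weigh more."""
    _, _, likes, comments = post
    return likes + 3 * comments

# Boil the raw feed down: keep the top 3 moments, then put them
# back in chronological order so they read as a story.
highlights = sorted(sorted(posts, key=engagement, reverse=True)[:3])

for when, text, likes, comments in highlights:
    print(f"{when}: {text} ({likes} likes, {comments} comments)")
```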

How Big Data tells a story with accuracy and impact
There’s no doubt that the science behind Big Data is inescapable, but some data scientists have struggled to transform this information into a palatable story for the everyday user to consume. Jeff Bladt and Bob Filbin, data scientists for the activist charity-driven website Dosomething.org, wrote about this process, with which they’re still constantly experimenting, in a recent issue of Harvard Business Review.


Infographic: Big Data or Big Confusion? The Key is Open Data Services

July 22nd, 2014

When folks refer to “Big Data” these days, what is everyone really talking about? For several years now, Big Data has been THE buzzword used in conjunction with just about every technology issue imaginable. The reality, however, is that Big Data isn’t an abstract concept. Whether you like it or not, you’re already inundated with Big Data. How you source it, what insights you derive from it, and how quickly you act on it will play a major role in determining the course—and success—of your company. To help you get started understanding the key Big Data trends, take a look at this infographic: “60-Second Guide to Big Data and the Cloud.”

[Infographic: 60-Second Guide to Big Data and the Cloud]

Handling the increased volume, variety, and velocity of data (the “3 Vs,” shown in the center of the infographic) requires a fundamental shift in the makeup of the platform needed to capture, store, and analyze that data. A platform capable of handling and capitalizing on Big Data successfully requires a mix of relational databases for structured data, NoSQL databases for unstructured data, caching solutions, and MapReduce-style tools like Hadoop.
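To make the MapReduce piece of that mix concrete, here’s a toy Python word count that walks through the same map, shuffle, and reduce phases Hadoop runs at scale. This single-process version is purely illustrative; the point of Hadoop is distributing these phases across many machines:

```python
from collections import defaultdict

documents = [
    "big data needs big platforms",
    "open data services tame big data",
]

# Map phase: emit a (key, value) pair for every word.
mapped = [(word, 1) for doc in documents for word in doc.split()]

# Shuffle phase: group the emitted values by key.
groups = defaultdict(list)
for word, count in mapped:
    groups[word].append(count)

# Reduce phase: collapse each group to a single result.
counts = {word: sum(values) for word, values in groups.items()}

print(counts)  # e.g. {'big': 3, 'data': 3, ...}
```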

As the need for new technologies to handle the 3 Vs of Big Data has grown, open source solutions have become the catalysts for innovation, generating a steady stream of relevant new products to tackle Big Data challenges. Thanks to the skyrocketing pace of innovation in specialized databases and applications, businesses can now choose from a variety of proprietary and open source solutions, depending on the database type and their specific requirements.

Given the wide variety of new and complex solutions, however, it’s no surprise that a recent survey of IT professionals showed that more than 55% of Big Data projects fail to achieve their goals. The most significant challenge cited was a lack of understanding of, and an inability to pilot, the range of technologies on the market. That challenge systematically pushes companies toward a limited set of proprietary platforms, often reducing the choice to a single technology. But seeking one cure-all technology is no longer a realistic strategy: no single technology, a lone database, say, can solve every problem, especially when it comes to Big Data. And even if one solution could serve multiple needs, successful companies are always trialing new solutions in the quest to innovate perpetually and thereby gain (or maintain) a competitive edge.

Open Data Services and Big Data go hand-in-hand


Architecting for High Availability in the Cloud

July 22nd, 2014

An introduction to multi-cloud distributed application architecture

In this blog, we’ll explore how to architect a highly available (HA) distributed application in the cloud. For those new to the concept, high availability refers to the availability of the application cluster as well as its ability to fail over or scale as needed. The ability to fail over or scale out horizontally to meet demand is what keeps the application highly available. Examples of applications that benefit from HA architectures include database applications, file-sharing networks, social applications, health monitoring applications, and eCommerce websites. So, where do you start? The easiest way to understand the concepts is simply to walk through the 3 steps of a web application setup in the cloud.

Step 1: Setting up a distributed, fault-tolerant web application architecture

In general, the application architecture can be pretty simple: perhaps just a load-balanced web front end running on multiple servers and maybe a NoSQL database like Cassandra. When you’re developing, you can get away with a single server, but once you move into production you’ll want to snapshot your web front end and spread the application across multiple servers. This approach lets you balance traffic and scale out the web front end as needed. In GoGrid, you can do this for free using our Dynamic Load Balancers. Point and click to provision the servers as needed, and then point the load balancer(s) to those servers. The process is simple, so setting up a load-balanced web front end should only take a few minutes. Any data captured or used by the servers will of course be stored in the Cassandra cluster, which is already designed to be HA.

[Diagram: load-balanced web front-end architecture]
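To see why load balancing keeps the front end highly available, here’s a minimal sketch of the round-robin idea with a health check. The server addresses and the failure are invented; GoGrid’s Dynamic Load Balancers do this for you:

```python
import itertools

# Invented private addresses for the snapshot-cloned web servers.
SERVERS = ["10.1.0.11", "10.1.0.12", "10.1.0.13"]
_pool = itertools.cycle(SERVERS)

def healthy(server: str) -> bool:
    """Stand-in health check; a real balancer probes HTTP or TCP."""
    return server != "10.1.0.12"  # pretend this node just failed

def next_server() -> str:
    """Round-robin across the pool, skipping unhealthy nodes."""
    for _ in range(len(SERVERS)):
        candidate = next(_pool)
        if healthy(candidate):
            return candidate
    raise RuntimeError("no healthy servers available")

# Traffic keeps flowing to the two surviving servers.
for _ in range(4):
    print("routing request to", next_server())
```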

Step 2: Deploying the Cassandra cluster

In GoGrid, you can use our 1-Button Deploy™ technology to set up the Cassandra cluster in about 10 minutes. This will provision the cluster for your database. Cassandra is built to be HA, so if one server fails, the load is distributed across the cluster and your application isn’t impacted. Below is a sample Cassandra cluster. A minimal deployment has 3 nodes to ensure HA, and the cluster is connected via the private VLAN. It’s a good idea to firewall the database servers and eliminate connectivity to the public VLAN. With our production 1-Button Deploy™ solution, the cluster is configured to include an on-demand firewall (for free). In another blog post, I’ll discuss how to secure the entire environment: setting up firewalls around your database and web application as well as working with IDS and IPS monitoring tools and DDoS mitigation services. For the moment, however, your database and web application clusters would look something like this:

[Diagram: sample 3-node Cassandra cluster and web front end connected via private VLAN]
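Once the 3-node cluster is up, the application should connect to all of the nodes rather than just one so a single failure doesn’t take the app down with it. Here’s a brief sketch using the open source DataStax Python driver (cassandra-driver); the private VLAN addresses and keyspace name are placeholders:

```python
from cassandra.cluster import Cluster

# Contact points on the private VLAN (invented addresses). Listing all
# 3 nodes lets the driver connect even if one of them is down.
cluster = Cluster(["10.2.0.21", "10.2.0.22", "10.2.0.23"])
session = cluster.connect()

# Replicate every row to all 3 nodes so data survives a node failure.
session.execute("""
    CREATE KEYSPACE IF NOT EXISTS webapp
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 3}
""")
session.set_keyspace("webapp")
session.execute("""
    CREATE TABLE IF NOT EXISTS events (
        id uuid PRIMARY KEY,
        payload text
    )
""")
cluster.shutdown()
```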
