Infographic: Keep your patient health info secure in the cloud

January 22nd, 2014 - 4,251 views

Maintaining data security in the healthcare sector is hard. Although every business worries about securing confidential data, that concern doesn’t compare to the burden on companies managing personal health information, which must comply with the Health Insurance Portability and Accountability Act (HIPAA) and other relevant regulations. Unfortunately, the sensitive nature of these assets makes them even more desirable to cybercriminals. The result: Patient health information is being targeted more frequently and more aggressively than ever before. Fortunately, the evolving IT landscape provides ways to address these threats: proactive security monitoring to identify and mitigate potential risks, and encryption to protect the data itself.
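To make that last point concrete, the sketch below shows one simplified way to encrypt a patient record before it is written to cloud storage. It is only an illustration: it assumes Python with the open-source cryptography package, and the record fields, file name, and key handling are hypothetical, not part of any GoGrid service.

```python
# Minimal sketch: encrypting a patient record before it leaves the application.
# Assumes the open-source "cryptography" package (pip install cryptography).
# The record contents and file name are hypothetical examples.
import json
from cryptography.fernet import Fernet

# In production the key would come from a key-management service, never from code.
key = Fernet.generate_key()
cipher = Fernet(key)

record = {"patient_id": "12345", "diagnosis": "hypertension", "visit": "2014-01-22"}
token = cipher.encrypt(json.dumps(record).encode("utf-8"))

# Only the ciphertext is persisted or uploaded to cloud storage.
with open("patient_record.enc", "wb") as f:
    f.write(token)

# Decryption is only possible with the same key.
restored = json.loads(cipher.decrypt(token).decode("utf-8"))
assert restored == record
```

In a real deployment the key would live in a key-management system with its own access controls and audit trail, which is where the monitoring half of the equation comes back in.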

Outside attacks are only one aspect of the problem, however: Negligent insiders are also putting their organizations at risk. Studies have shown that roughly 94% of healthcare firms have experienced at least one data breach within the past two years. Because these incidents cost the industry upwards of $7 billion per year, administrators must proactively seek strategies that reduce the likelihood of a breach.

Financial repercussions of a data breach

Due to the regulations governing personal health information, the reputation damage and bottom-line costs of a data breach are often exacerbated by compliance fines. More troubling, these breaches are increasing in both frequency and severity. Experts believe that the financial repercussions of data breaches grew by $400,000 between 2010 and 2012, with more than half of companies losing $500,000 or more in 2012. With the price tag expected to rise 10 percent year over year through 2016, businesses must plan ahead to contain these costs.

To illustrate the effect of data breaches on healthcare organizations and the magnitude of the response required, we’ve put together the following infographic, “Keep Your Patient Health Info Secure in the Cloud.” Part of our series of 60-second guides, the graphic will show you in only a minute why the cloud is powering new ways to secure some of the most personal information available: details about our health.

[Infographic: Keep Your Patient Health Info Secure in the Cloud]

Read the rest of this entry »

Big Data Cloud Servers for Hadoop

January 13th, 2014 - 2,791 views

GoGrid just launched Raw Disk Cloud Servers, the perfect choice for your Hadoop data node. These purpose-built Cloud Servers run on the latest Intel Ivy Bridge processors over a redundant 10-Gbps network fabric. What sets these servers apart, however, is the massive amount of raw storage in a JBOD (Just a Bunch of Disks) configuration: you can deploy up to 45 x 4 TB SAS disks on a single Cloud Server.

These servers are designed to serve as Hadoop data nodes, which are typically deployed in a JBOD configuration. This setup maximizes available storage space on the server and also aids in performance. There are roughly 2 cores allocated per spindle, giving these servers additional MapReduce processing power. In addition, these disks aren’t a virtual allocation from a larger device. Each volume is actually a dedicated, physical 4 TB hard drive, so you get the full drive per volume with no initial write penalty.

Hadoop in the cloud

Most Hadoop distributions call for a name node supporting several data nodes. GoGrid offers a variety of SSD Cloud Servers that are perfect for the Hadoop name node. Because they sit on the same 10-Gbps high-performance fabric as the Raw Disk Cloud Servers, SSD servers provide low-latency private connectivity to your data nodes. I recommend using at least the X-Large SSD Cloud Server (16 GB RAM), although you may need a larger server depending on the size of your Hadoop cluster: because Hadoop stores metadata in memory, you’ll want more RAM if you have a lot of files to process. You can use any size Raw Disk Cloud Server, but because Hadoop defaults to a replication factor of three, you’ll want at least 3 data nodes to distribute your data across and protect it from failure. Each Raw Disk Cloud Server size has a different allocation of raw disks, as illustrated in the table below; the server shown is the smallest size with multiple disks. Although Hadoop attempts to replicate data to different racks, there’s no guarantee that your Cloud Servers will be on different racks.

Note that the example below is for illustrative purposes only and is not representative of a typical Hadoop cluster; most Cloudera and Hortonworks sizing guides start at 8 nodes, for example. These configurations can differ greatly depending on whether you intend to use the cluster for development, production, or production with HBase added, including the RAM and disk sizes (less of both for development, most likely more for HBase). And if you’re thinking of using these nodes for production, you should consider adding a second name node.
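To make the sizing math concrete, here’s a rough capacity sketch. The 4 TB disk size and replication factor of 3 come from the discussion above; the node and disk counts in the example, and the reserve for intermediate MapReduce output, are hypothetical assumptions for illustration only.

```python
# Back-of-the-envelope HDFS capacity estimate for a small Raw Disk cluster.
# Disk size and replication factor come from the post; the temp-space
# reserve is an illustrative assumption, not a fixed Hadoop rule.

def usable_hdfs_capacity_tb(data_nodes, disks_per_node, disk_tb=4,
                            replication=3, temp_reserve=0.25):
    """Rough usable capacity in TB after replication and scratch space."""
    raw_tb = data_nodes * disks_per_node * disk_tb
    after_reserve = raw_tb * (1 - temp_reserve)   # leave room for MapReduce spill
    return after_reserve / replication            # each block is stored 3 times

# Example: 3 data nodes with 8 disks each (a hypothetical allocation).
print(usable_hdfs_capacity_tb(3, 8))   # -> 24.0 TB of usable HDFS space
```

The takeaway is that raw JBOD capacity shrinks quickly once replication and scratch space are accounted for, which is worth factoring into how many Raw Disk Cloud Servers you deploy.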

[Hadoop cluster illustration]

Read the rest of this entry »

Does it take a village to ensure security (or just hard work)?

January 6th, 2014 - 2,689 views

I watched an interview this morning in which Snapchat’s CEO discussed the recent exposure of its users’ phone numbers and names, and something he said stood out to me: “Tech businesses are susceptible to hacking attacks. You have to work really, really, really hard with law enforcement, security experts, and various external and internal groups to make sure that you’re addressing security concerns.”


I have to agree with him: It takes a lot of effort to keep up with the latest security threats and vulnerabilities, to continuously assess existing security safeguards, to open channels of communication with security peers in other organizations, and to work with local and federal law enforcement to solve common security problems. Even companies like Target that spend millions on security are clearly challenged every day to identify and remove vulnerabilities and protect their customers’ data.

The rapid growth of cloud services and cloud service providers has only added new areas of concern for organizations hoping to leverage the benefits of the cloud. Organizations must perform their due diligence in identifying the right cloud service provider for their needs—preferably one that’s had time to develop security best practices based on firsthand experience and hard-won expertise. Securing a company’s production environment requires a cloud partner that is mature and has dedicated resources to provide robust security services and products.

Consider the recent DigitalOcean security revelation that its customers could view data from a VM previously used by another customer. According to one reporter, a DigitalOcean customer “noted that DigitalOcean was not by default scrubbing user’s data from its hard drives after a virtual machine instance was deleted.” Why not? DigitalOcean acknowledged that the deletes were taking too long to complete and risked degrading the performance of its services.

I recognize that challenge because GoGrid addressed this same issue years ago. All our deleted VMs go through an automated secure scrubbing process that ensures a previous customer’s data isn’t inadvertently shared with a new customer—and we do so without impacting our production environment. Was that easy to accomplish? No, it wasn’t. In fact, it took a lot of engineering work and resources to develop the right way to secure our customers’ data without impacting performance. Taking technical shortcuts when it comes to security often results in unexpected consequences that can affect an organization’s overall security—and ultimately, its reputation.
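GoGrid hasn’t published the internals of that scrubbing pipeline, so the sketch below is only a generic illustration of the underlying idea: overwriting a released volume image with zeros, in fixed-size chunks, before its storage is ever handed to another customer. The function, path, and chunk size are hypothetical.

```python
# Generic illustration of scrubbing a released volume by overwriting it with
# zeros in fixed-size chunks. This is NOT GoGrid's actual process; the path
# and chunk size below are hypothetical.
import os

def scrub_volume(path, chunk_mb=64):
    """Overwrite the entire volume image at `path` with zeros."""
    chunk = b"\0" * (chunk_mb * 1024 * 1024)
    size = os.path.getsize(path)  # for raw block devices, the size must be queried differently
    with open(path, "r+b") as dev:
        written = 0
        while written < size:
            to_write = min(len(chunk), size - written)
            dev.write(chunk[:to_write])
            written += to_write
        dev.flush()
        os.fsync(dev.fileno())    # make sure the zeros actually reach the disk

# Example (hypothetical path to a deprovisioned volume image):
# scrub_volume("/var/lib/volumes/deleted-vm-volume.img")
```

The hard part, as noted above, isn’t the overwrite itself but scheduling and throttling it so scrubbing never competes with production I/O.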

Read the rest of this entry »

Big Data Holds Promise for Those Who Are Proactive

January 3rd, 2014 - 2,268 views

Companies are increasingly drawn to the Big Data market because of the potential benefits associated with embracing innovative information aggregation, management, storage, and analytics projects. This prospect has led organizations around the world to collect constantly expanding volumes of digital resources that promise to create opportunities to gain a competitive advantage, reduce costs, or improve overall operations.

Big Data holds promise for those who are proactive

At the same time, however, many businesses have encountered unforeseen challenges in their race to collect large and complex data sets. Many of these issues stem from the fact that firms are simply gathering more information than they know how to handle, which puts pressure on outdated infrastructure and creates headaches for the IT department and executives alike. Many enterprises overlook the fact that Big Data requires a deliberate management and organizational strategy rather than an ad hoc approach.

Understand what challenges lie ahead
Businesses that want to make the most of Big Data must recognize which challenges will impact their bottom line and identify the steps that will allow their teams to overcome those obstacles. The truth is that databases are growing faster than ever, which has led to new bottlenecks and other performance issues, including reduced processing speeds and unexpected costs associated with mitigating those problems.

Fortunately, building a Big Data strategy from the ground up can introduce new competitive opportunities without introducing unnecessary expenses or technical complications. Although a number of steps are required to construct these programs, business decision-makers should consider developing a new plan of action rather than relying on outdated philosophies that categorize storage as a dark and desolate vault that is virtually inaccessible.

Businesses now have an abundance of technologies at their fingertips, including cloud computing and sophisticated analytics, that allow employees to optimize operations without driving costs through the roof. The cloud in particular can be highly advantageous for Big Data storage because of its scalable, on-demand nature. This approach lets companies continue their current trajectory and gather increasingly large volumes of information without running into as many performance bottlenecks.

Read the rest of this entry »

Quality Customer Service Requires Predictive, Real-Time Insight

December 13th, 2013 - 4,130 views

In today’s increasingly crowded and fast-paced business world, organizations are continually pressured to meet the rapidly evolving needs of prospective and existing customers. In many cases, embracing Big Data strategies will enable firms to collect large amounts of information, analyze those assets, and turn them into meaningful insights that will help build better relationships with customers, ensure retention, and enhance loyalty.

Quality customer service requires predictive, real-time insight

Businesses that don’t provide customers with a unique, holistic experience will feel the repercussions faster than ever. The proliferation of social media and other highly collaborative web services gives consumers a booming voice on the Internet. This phenomenon is encouraging organizations to adopt more innovative customer service programs that allow decision-makers to understand what customers want before those demands are even voiced. The only way to achieve these capabilities is through the use of Big Data analytics.

The real-time necessity
Executives, support staff, and other customer service representatives need the ability to identify poor experiences as quickly as possible because waiting too long and letting customers leave an interaction on a bad note can result in major long-term consequences. Although monitoring conversations between corporate employees and consumers can help decision-makers gain more insight into how their agents handle queries, this practice doesn’t necessarily provide the time needed to make adjustments before it’s too late.

Rather than taking this traditional reactive approach, companies should become proactive. Launching predictive analytic initiatives to collect large volumes of data on prospective and existing customers can help identify current and future trends. This information gives decision-makers a unique perspective on what customers actually want, making it easier to meet (and hopefully exceed) expectations while developing a reputation for responsiveness, innovation, and ease of use.
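As a toy illustration of what a predictive initiative like this might look like at its simplest, the sketch below fits a basic churn model to a handful of made-up interaction features. The data, feature names, and library choice (Python with scikit-learn) are hypothetical assumptions, not a prescription for any particular analytics stack.

```python
# Toy sketch of predictive customer analytics: flag customers likely to churn
# based on past interaction data. The data and features are entirely made up;
# scikit-learn is assumed to be installed.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: [support tickets last quarter, avg. satisfaction score, months as customer]
X = np.array([
    [5, 2.1, 3], [0, 4.8, 24], [3, 3.0, 6], [1, 4.5, 18],
    [6, 1.9, 2], [0, 4.9, 36], [4, 2.5, 5], [2, 4.0, 12],
])
y = np.array([1, 0, 1, 0, 1, 0, 1, 0])  # 1 = churned, 0 = retained

model = LogisticRegression().fit(X, y)

# Score a new customer: 4 tickets, low satisfaction, 4 months of history.
risk = model.predict_proba([[4, 2.2, 4]])[0, 1]
print(f"Estimated churn risk: {risk:.0%}")
```

The point isn’t the particular model; it’s that a score like this surfaces an at-risk customer while there is still time to intervene, rather than after the relationship has already soured.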

Overcoming unforeseen hurdles
Big Data technologies hold the promise of a deeper understanding of customer behavior. However, decision-makers must guard against being too ambitious and collecting every scrap of information available in the hope that some piece will eventually prove insightful. Embracing Big Data in this way can lead to performance and efficiency problems, even if a company uses cloud computing to match the scalability demands of increasingly large and complex data sets.

Read the rest of this entry »