Hybrid cloud, security and containers: My morning at Structure 2016

I have been attending the Structure Conference pretty regularly for a few years, and each year I am impressed by the diversity of the discussions. The themes this year centered on hybrid clouds, security and the real value of containers. These are all great discussions, and all of them have been written about extensively.

I arrived at Structure 2016 on November 9 – yes, the day after the U.S. election. Some folks swizzled their presentations to work in their insights on the result.

The first presentation of the day, from Bryan Cantrill, was a wake-up call. Bryan, the chief technology officer of Joyent (now part of Samsung), is a high-energy, provocative speaker. If you haven’t heard him speak, I recommend listening to some of his presentations on YouTube. One of my favorites is his DockerCon 2015 talk, in which his two themes were that taking software into production is like going to war, and that software can’t fail, ever. If you think it can, you are wrong. (At any rate, watch the video. Truer words were never spoken.)

At Structure, Bryan moved into the political realm. He talked about “software eating the world” (Marc Andreessen), and how, along with it, some jobs are being automated and plants need fewer people. The cloud accelerates this, since it makes it easier and quicker to write high-value applications. The change is happening across all industries, and manufacturing was impacted first. Bryan asserted that the shift left a gap in parts of America, as former manufacturing employees see their opportunities shrink. This isn’t new; technology has been automating functions while creating new opportunities for a long time. It is also happening to technology jobs. For example, you used to have to construct business reports manually, using data from multiple sources. Now, with the democratization of big data and data visualization tools, those reports can be built automatically, so fewer people are needed to assemble them and cleanse the data. The hundred data scientists you need today to analyze data will probably shrink to fewer than ten roles as applications become automated. Folks need to move to a model where they are continuously learning to be ready for the next wave. It’s not an easy feat – and not something everyone is excited about.

The tone about clouds at Structure 2016 was that one size doesn’t fit all anymore. There was much more discussion this year about hybrid clouds and the real cost of moving to the public cloud. Matt Wood, general manager of product strategy at AWS, was asked whether it is more cost-effective to stay on premises when a workload is stable in terms of compute and storage. Matt didn’t really have time to answer the question, and the answer sort of sounded like “No it isn’t, believe me.” Unfortunately, I don’t think anyone in the audience did believe him. I was surprised when Matt said the most common applications in the cloud were “test and dev.” Folks start there and move on to higher-value deployments over time.
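The back-of-the-envelope math behind that audience skepticism is easy to sketch. Every figure below is a hypothetical placeholder (not an AWS list price or anything Matt quoted); the point is simply that a steady, fully utilized workload amortizes owned hardware well, while a bursty one favors paying by the hour.

```python
# Hypothetical comparison of a workload's annual cost on owned hardware
# vs. on-demand cloud instances. All numbers are placeholder assumptions.

HOURS_PER_YEAR = 24 * 365


def on_prem_annual_cost(server_price: float, lifespan_years: float,
                        annual_power_and_cooling: float,
                        annual_admin_overhead: float) -> float:
    """Amortize the hardware purchase and add yearly operating costs."""
    return server_price / lifespan_years + annual_power_and_cooling + annual_admin_overhead


def cloud_annual_cost(hourly_rate: float, avg_utilization: float) -> float:
    """On-demand cost: you pay only for the hours the instance is running."""
    return hourly_rate * HOURS_PER_YEAR * avg_utilization


if __name__ == "__main__":
    # Placeholder figures for one server-sized workload.
    on_prem = on_prem_annual_cost(server_price=9000, lifespan_years=3,
                                  annual_power_and_cooling=1200,
                                  annual_admin_overhead=2000)
    steady = cloud_annual_cost(hourly_rate=0.80, avg_utilization=1.0)   # runs 24x7
    bursty = cloud_annual_cost(hourly_rate=0.80, avg_utilization=0.15)  # runs ~15% of the time

    print(f"On-premises (amortized): ${on_prem:,.0f}/year")
    print(f"Cloud, steady 24x7:      ${steady:,.0f}/year")
    print(f"Cloud, bursty workload:  ${bursty:,.0f}/year")
```

With these made-up inputs, the steady workload comes out cheaper on premises and the bursty one cheaper in the cloud – which is exactly why “it depends on the workload” is the honest answer to the question Matt was asked.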

The panel I was on, “Now that I have all this data, where do I put it?”, was moderated by Barb Darrow and was supposed to cover where data should live. The topic quickly morphed into a discussion about security, following questions about whether people were rethinking their cloud strategies after the outages caused by the recent attack on Dyn. First, Dyn was under a galactic attack, and it is amazing the company came back so quickly. What the incident does show, though, is that people going to the cloud are outsourcing their IT to other folks. Yup, that is exactly what the attack drove home. It’s not a bad thing to outsource, but before you do so, you should ensure your providers have no single point of failure – just as you would have had to plan if you were managing the service yourself. So as much as you can (I know it’s difficult when you are buying SaaS products), find out how the company you are buying from thinks about and manages the availability of its services.

Just as you are outsourcing availability to the provider, you are also outsourcing data security. Do not become complacent, and don’t assume encryption in transit and at rest is enough. You need to really understand what data you are putting in the cloud. Some data carries regulatory requirements to protect it, and you need to know where that type of data is at all times and who is accessing it. This gets tricky as data moves around the ether. Cloud access security brokers (CASBs) are designed to help you keep your secure data on premises and, when it does move to the cloud, to understand where it has gone and who moved it.
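To make “knowing what you have” concrete, here is a minimal sketch of the kind of pattern scan a data-classification step (or a CASB’s discovery phase) runs before data is cleared to move. The patterns, paths and file handling below are illustrative assumptions, not a compliance checklist or any particular vendor’s product.

```python
# A minimal sketch of the "know what you have" step: scan local files for
# patterns that often indicate regulated data, so you can decide what is
# safe to move to a cloud service. Patterns and paths are illustrative only.

import re
from pathlib import Path

PATTERNS = {
    "US Social Security number": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "Payment card number (loose match)": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "Email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}


def scan_for_sensitive_data(root: str) -> dict[str, list[str]]:
    """Return a mapping of file path -> list of pattern names found in it."""
    findings: dict[str, list[str]] = {}
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue  # unreadable file; skip rather than fail the whole scan
        hits = [name for name, rx in PATTERNS.items() if rx.search(text)]
        if hits:
            findings[str(path)] = hits
    return findings


if __name__ == "__main__":
    # "./data" is a placeholder directory; point this at whatever you plan to migrate.
    for file_path, hits in scan_for_sensitive_data("./data").items():
        print(f"{file_path}: {', '.join(hits)}")
```

Real classification tools go far beyond regular expressions, but even a crude inventory like this tells you which files need a policy decision before they leave the building.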

Data security comes down to understanding where your data is and having smart policies around which types of data can live where. Someone has probably told you at some point that nothing in life is free. This is especially true if you think you can outsource data availability and security without monitoring them closely. You would never blindly let someone take ownership of your most precious things in your personal life; you need to use the same care in your professional life. It all starts with knowing what you have and knowing what you care about.

Find out what’s in your data with a free data security assessment.


Paula Long

Paula Long is the CEO and co-founder of DataGravity. She previously co-founded storage provider EqualLogic, which was acquired by Dell for $1.4 billion in 2008. She remained at Dell as vice president of storage until 2010. Prior to EqualLogic, she served in engineering management positions at Allaire Corporation and oversaw the ClusterCATS product line at Bright Tiger Technologies. She is a graduate of Westfield State College.