
5 tips for securing apps across multiple clouds

Securing multiple clouds is a tricky business. Experts explain why building protections from the inside out works better than simply channeling workloads to specific clouds.

Creating a hybrid cloud computing environment – one that relies on a mix of on-premises and public cloud services – holds out the promise of greater flexibility for federal agencies in deciding where to run diverse workloads and applications. It also promises greater computing economies. But enforcing policies and security controls between and among multiple clouds can be a tricky business.

Agency managers can encounter myriad challenges as they use multiple cloud providers in conjunction with their own internal private cloud infrastructure. Issues surrounding compliance, data flow and protection, security, and visibility of information can complicate even the most basic expansion moves, including how to divide up data workloads.

One approach is to use the private cloud for the predictable volumes of workloads and the public cloud for variable or bursts of workloads.

Anil Karmel — the former deputy chief technology officer with the National Nuclear Security Administration and now CEO of C2 Labs, a cloud security and services company — suggests a different approach. Karmel is a proponent of agency managers building their hybrid clouds from the inside out and then tying the various cloud infrastructures into a unified management plane. That way, organizations can use the same toolsets and security profiles across multiple, interconnected clouds, he said.


“We live in a multi-cloud era,” Karmel said. No single service provider, whether on premises or a commercial cloud service provider, can offer all the services required by a large organization, he noted.

Karmel has an added perspective as co-chair of the National Institute of Standards and Technology’s Cloud Security Working Group, which is working on a new draft of a Risk Management Framework for Cloud. That framework is expected to help organizations quantify the risks of moving their data or applications to the cloud.

In an interview with FedScoop, Karmel offered best practices for federal managers working in a multi-cloud environment.

1. Understand your data. To quantify your risk, you have to understand your data. That means quantifying the value of the information that has to be protected. “There is no way you can devise security controls around your data if you don’t know what the value of that information is,” Karmel said. Public information, for instance, does not require the same level of protection as agency-sensitive information. “By understanding the value of your information, you can select the appropriate security controls to place around that information to protect it.”
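As a rough illustration of that idea (not Karmel's or NIST's guidance), the sketch below maps hypothetical data classifications to the control sets an agency might apply. The classification names, impact levels and control lists are illustrative placeholders.

```python
# Hedged sketch: map hypothetical data classifications to control sets.
# Classification names, impact levels and control lists are placeholders,
# not an agency standard.

IMPACT_LEVELS = {
    "public": "low",          # e.g., published reports
    "internal": "moderate",   # agency-sensitive business data
    "restricted": "high",     # data whose compromise causes severe harm
}

CONTROL_PROFILES = {
    "low": ["TLS in transit", "basic access logging"],
    "moderate": ["TLS in transit", "encryption at rest", "MFA", "audit logging"],
    "high": ["TLS in transit", "encryption at rest", "MFA", "audit logging",
             "agency-controlled key management", "continuous monitoring"],
}

def controls_for(classification):
    """Return the control set to apply for a given data classification."""
    return CONTROL_PROFILES[IMPACT_LEVELS[classification]]

print(controls_for("internal"))
```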

2. Decompose your data. Information resides in applications. As applications have been redeveloped for the cloud by developers using agile principles, software containers such as Docker have risen in popularity. But agency IT departments need to place greater focus on a microservices architecture to take full advantage of those applications. By decomposing applications into smaller functions, IT personnel can move applications from environments running directly on virtual machines to those running inside application containers, which reside on virtual machines. That makes it easier to move from a one-to-one (application to virtual machine) approach to a one-to-many (application containers to virtual machine) approach. These applications can then be scaled – either horizontally or vertically – across cloud service providers or environments at the click of a button.


There are multiple benefits to working this way. First, applications can be seamlessly moved across multiple clouds and across development, testing, quality assurance and production environments. Second, IT managers can ensure there are no deviations from the base code across those environments. From a security standpoint, though, you still need a platform to manage and secure these containers.
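As a minimal sketch of that one-to-many pattern, the example below uses the Docker SDK for Python to launch several container replicas of a single application image. The image name, replica count and labels are hypothetical placeholders, and a real deployment would typically hand this scheduling to an orchestration platform.

```python
# Hedged sketch: scale one application image horizontally as containers
# using the Docker SDK for Python (pip install docker).
# The image name, replica count and labels are hypothetical placeholders.
import docker

client = docker.from_env()

IMAGE = "registry.example.gov/benefits-api:1.4.2"  # hypothetical, pinned tag
REPLICAS = 3

containers = [
    client.containers.run(
        IMAGE,
        detach=True,                   # run in the background
        name=f"benefits-api-{i}",
        labels={"env": "production"},  # same base image across environments
    )
    for i in range(REPLICAS)
]

print([c.name for c in containers])
```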

3. Monitor your data. Organizations need to take a defense-in-depth approach, using an array of security tools to monitor on-premises and public cloud services, coupled with intelligence drawn from global threat feeds. “No one security tool is going to give you full scope visibility across your environment,” Karmel said. This approach allows organizations to correlate activities across tools and expose potential adversaries.
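A toy illustration of that correlation step might look like the following. The feed functions are hypothetical stand-ins for an on-premises SIEM, a cloud provider's monitoring service and a global threat-intelligence feed.

```python
# Hedged sketch of correlating alerts across tools; the feed functions are
# hypothetical stand-ins, not real product APIs.
from collections import defaultdict

def on_prem_alerts():
    return [{"source_ip": "203.0.113.7", "event": "failed_login", "tool": "siem"}]

def cloud_alerts():
    return [{"source_ip": "203.0.113.7", "event": "api_key_probe", "tool": "cloud"}]

def threat_feed_indicators():
    return {"203.0.113.7"}  # known-bad IPs from a global feed

def correlate():
    """Group alerts by source IP and flag overlaps with threat intelligence."""
    by_ip = defaultdict(list)
    for alert in on_prem_alerts() + cloud_alerts():
        by_ip[alert["source_ip"]].append(alert)
    bad = threat_feed_indicators()
    return {ip: alerts for ip, alerts in by_ip.items()
            if ip in bad and len(alerts) > 1}

# IPs seen by more than one tool and known to threat intel
print(correlate())
```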

By understanding, decomposing and monitoring their data, agency IT teams can effectively secure their agency’s information in a multi-provider cloud environment. And the forthcoming NIST Risk Management Framework for Cloud will give managers a way to quantify that risk from the customer’s perspective, Karmel said.

Pete Lindstrom, research director of security solutions with IDC, pointed to an additional set of reasons why policy enforcement between and among multiple clouds is tricky: managers need to protect their resources in different ways. Technology can aid in this effort, he said, offering these additional recommendations:

4. Consider implementing encrypted overlay networks. Networks built on top of other networks, or overlay networks, can provide additional protections by securing data in motion, connecting resources across clouds or isolating applications inside a single cloud region. “Look at how you are creating encrypted tunnels between and among clouds,” Lindstrom said. Zones of trust typically disappear in a multi-cloud environment, so firewalls are not as effective. He stressed the importance of creating “pathways and pipelines through all the clouds” with local area network overlays and multiple types of encryption.
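At the simplest level, securing data in motion between clouds means an encrypted connection to a service running elsewhere, sketched below with Python's standard library. The endpoint is a hypothetical placeholder; a full overlay network (for example, IPsec or WireGuard tunnels) would operate at the network layer rather than per connection.

```python
# Hedged sketch: encrypt data in motion to a service in another cloud with
# TLS from the Python standard library. The host name is a hypothetical
# endpoint, not a real service.
import socket
import ssl

REMOTE_HOST = "app.other-cloud.example.gov"   # hypothetical endpoint
REMOTE_PORT = 443

context = ssl.create_default_context()        # verifies the server certificate

with socket.create_connection((REMOTE_HOST, REMOTE_PORT)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname=REMOTE_HOST) as tls_sock:
        print("negotiated", tls_sock.version())  # e.g., TLSv1.3
        tls_sock.sendall(b"GET /health HTTP/1.1\r\nHost: " +
                         REMOTE_HOST.encode() + b"\r\n\r\n")
        print(tls_sock.recv(1024))
```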


5. Consider cloud security gateways at the application level. Cloud security gateways allow managers to maintain control over cloud-based resources and applications, applying policy controls when needed while monitoring activity within the application layer on the network. They operate either physically or logically inline between users and applications, and can shut down a session, apply surgical encryption or masking, and identify malicious behavior, among other actions, according to an IDC report. Vendors providing these kinds of tools include Bitglass, CipherCloud, Imperva, Netskope, Microsoft (which just bought Adallom), and Skyhigh Networks.
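As a simplified sketch of what an inline gateway does (not an implementation of any vendor's product), the WSGI middleware below terminates a blocked session and masks a sensitive field on its way back to the user. The user IDs, header name and masking rule are hypothetical.

```python
# Hedged sketch of a gateway-style policy layer sitting inline between users
# and an application, written as WSGI middleware. The blocklist, header and
# masking rule are hypothetical; commercial gateways apply far richer policy.
import re
from wsgiref.simple_server import make_server

BLOCKED_USERS = {"contractor-042"}           # hypothetical revoked sessions

def application(environ, start_response):    # stand-in for the protected app
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"ssn=123-45-6789\n"]

def gateway(app):
    def wrapped(environ, start_response):
        user = environ.get("HTTP_X_USER_ID", "")
        if user in BLOCKED_USERS:            # shut down the session
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"session terminated by policy\n"]
        body = b"".join(app(environ, start_response))
        # mask SSN-shaped values before they reach the user
        return [re.sub(rb"\d{3}-\d{2}-\d{4}", b"***-**-****", body)]
    return wrapped

if __name__ == "__main__":
    make_server("", 8080, gateway(application)).serve_forever()
```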

There is still a lot of work to be done to effectively manage and secure multiple cloud environments. IDC analysts think the categories of cloud management broker and cloud service broker can be merged into one entity: “a cloud broker, which would manage performance and delivery of cloud services as well as negotiate relationships between cloud providers and cloud consumers.”

But the days of simply dividing data and applications between on-premises data centers and remote cloud providers appear to be waning.
