Data Center Knowledge

With the rapid growth of cloud computing, it is clear that many organizations are adopting some form of cloud model to optimize their business. Large cloud service providers do a great job with security, but there is still room for improvement within the private data center, and smaller cloud providers must work continuously to assure the integrity of their client base. Three important points to consider:

  • The cost of data breaches has increased: Breaking the downward trend of the past two years, both the organizational cost of a data breach and the cost per lost or stolen record have increased. The cost of a data breach for the organizations represented in the study rose from $5.4 million to $5.9 million.
  • Malicious attacks result in the highest per capita data breach cost: Consistent with prior reports, data exfiltration or loss resulting from a malicious attack produces the highest cost, at an average of $246 per compromised record. System glitches and employee mistakes result in much lower average per capita costs: $171 and $160, respectively.
  • The results show that the probability of a material data breach over the next two years, involving a minimum of 10,000 records, is nearly 19 percent.

Areas for improvement in cloud security when building a cloud platform:

Checking for open ports: For a small organization this is relatively easy, but a large cloud organization may span multiple data center points with many different firewalls to manage. How well are port controls, policies, and resources tracked when they are distributed? Misconfigured network, port, and security policies are a common cause of failure. For heterogeneous security architectures, there are tools that help monitor security appliances from different manufacturers.
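As a minimal sketch of what such a port check looks like, the following uses Python's standard `socket` module; the host and the list of ports treated as "should be closed" are illustrative assumptions, not a recommended baseline:

```python
import socket

def check_ports(host, ports, timeout=1.0):
    """Return the subset of `ports` that accept a TCP connection on `host`."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            # connect_ex returns 0 when the connection succeeds (port open)
            if sock.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports

if __name__ == "__main__":
    # Example policy: ports we expect to be closed on a hardened host
    expected_closed = [23, 3389, 5900]  # telnet, RDP, VNC
    unexpected = check_ports("127.0.0.1", expected_closed)
    if unexpected:
        print(f"ALERT: unexpected open ports: {unexpected}")
    else:
        print("All checked ports are closed")
```

In a real environment this kind of check would run from both inside and outside the firewall, since the two views of what is reachable often differ.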

Improper positioning of externally facing hypervisors and VMs: In some cases a VM must face the outside world, or a hypervisor must be positioned in a DMZ. Take extra care with these kinds of infrastructure workloads: do they interact with other internal resources? Are network policies controlling access to the hypervisor and its VMs? A hypervisor has access to many critical components within the data center, so if it is not locked down properly, host-level access can be dangerous.
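One way to catch this class of misconfiguration is to audit firewall rules for management ports exposed outside the internal network. The sketch below is a simplified illustration: the rule-tuple format, the management-port list, and the internal CIDR are all assumptions for the example, not any vendor's real schema.

```python
import ipaddress

# Assumed for illustration: management ports and the internal address range
MGMT_PORTS = {22, 443, 902}                        # SSH, HTTPS mgmt, hypervisor mgmt
INTERNAL_NET = ipaddress.ip_network("10.0.0.0/8")  # assumed internal range

def audit_rules(rules):
    """Return (source_cidr, port) pairs that expose management ports externally.

    Each rule is a (source_cidr, dest_port, action) tuple.
    """
    findings = []
    for source_cidr, dest_port, action in rules:
        src = ipaddress.ip_network(source_cidr)
        external = not src.subnet_of(INTERNAL_NET)
        if action == "ALLOW" and dest_port in MGMT_PORTS and external:
            findings.append((source_cidr, dest_port))
    return findings

rules = [
    ("10.1.0.0/16", 22, "ALLOW"),   # internal admin subnet: fine
    ("0.0.0.0/0", 443, "ALLOW"),    # management port open to the world: flagged
]
print(audit_rules(rules))           # [('0.0.0.0/0', 443)]
```

Running such an audit regularly, rather than once at deployment, is what keeps a DMZ-facing hypervisor from quietly drifting into an exposed state.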

Portals, databases, and applications not locked down properly: A user can have the best-secured server, hypervisor, and data center architecture, but holes in an application create a whole other set of problems. Some big failures happen because an application was not patched or a database was not locked down properly. These applications are delivered via the cloud, so this cannot be overlooked.

What users monitor externally vs. internally: Monitoring and visibility are important to keeping a cloud and data center architecture secure. Management and log correlation allow a user to catch suspect events quickly and isolate them to network segments, VMs, or physical servers. Security tools within the ecosystem let users control the flow of information granularly, so that, for example, a server communicates only over a particular VLAN pointing to a specific port on a unique switch. Data can be encrypted both internally and externally. The key is to monitor all of these processes and automate responses: not only is better visibility created, but the security model becomes more proactive.
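As a toy illustration of log correlation, the sketch below groups failed-login events by source IP and flags any address with a burst of failures inside a time window. The event-tuple format, threshold, and window are assumptions chosen for the example:

```python
from collections import defaultdict
from datetime import datetime, timedelta

def correlate_failures(events, threshold=3, window=timedelta(minutes=5)):
    """Flag source IPs with `threshold` or more failed logins inside `window`.

    Each event is a (timestamp, source_ip, event_type) tuple.
    """
    by_ip = defaultdict(list)
    for ts, ip, event_type in events:
        if event_type == "LOGIN_FAILED":
            by_ip[ip].append(ts)
    flagged = set()
    for ip, times in by_ip.items():
        times.sort()
        # Slide a window of `threshold` consecutive failures over the times
        for i in range(len(times) - threshold + 1):
            if times[i + threshold - 1] - times[i] <= window:
                flagged.add(ip)
                break
    return flagged

base = datetime(2024, 1, 1, 12, 0, 0)
events = [
    (base, "203.0.113.9", "LOGIN_FAILED"),
    (base + timedelta(minutes=1), "203.0.113.9", "LOGIN_FAILED"),
    (base + timedelta(minutes=2), "203.0.113.9", "LOGIN_FAILED"),
    (base + timedelta(minutes=2), "10.0.0.5", "LOGIN_FAILED"),
]
print(correlate_failures(events))
```

A flagged IP is exactly the kind of "subject" the article describes: once identified, automated responses can isolate it to a network segment, VM, or physical server.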

There are a lot of moving parts in the cloud. Like gears, all of these parts work together to deliver complex workloads for a variety of users spanning the globe. Cloud adoption will continue to grow. By testing and monitoring their own cloud and data center environments and applying security best practices, users will be prepared for anything that comes their way.