Can deep observability eliminate the blind spots in cybersecurity?
When it comes to network observability, zero trust is what comes to mind for most businesses. Security vendors continue to promote zero trust because it blocks all devices and connections until access is explicitly granted. The problem is that many organizations still struggle to implement it.
But while zero trust looks at devices and connections, who is monitoring the data in motion? How can businesses observe the data moving between their premises and the cloud? This is where deep observability comes in.
Deep observability is defining a new frontier by complementing observability tools with real-time network intelligence to eliminate security and performance blind spots. This powerful combination empowers IT teams to proactively mitigate security and compliance risk and ease the operational complexity of managing multi-cloud IT infrastructure.
As the market leader in deep observability, Gigamon has recorded double-digit growth in the Asia Pacific, a strong market signal of the demand for actionable network-level intelligence to secure and efficiently manage hybrid and multi-cloud IT environments.
Gigamon is also the first company to deliver complete network visibility and analytics on all information-in-motion. The firm is looking to increase its investments in APAC in 2022, with strategic plans in place to fuel this goal.
To understand more about deep observability and how Gigamon is helping businesses have a better view of their data and systems, Tech Wire Asia speaks to Bassam Khan, Vice President of Product and Technical Marketing at Gigamon.
“Today, every organization has a complicated network and IT infrastructure, and you need a lot of tools to access the network traffic and data. Currently, you would need each of these tools to go and fetch the data from different locations. When this happens, you end up with blind spots, which is a really bad thing. With Gigamon, when you plug us in one time, we bring the data to the tool. So instead of you hardcoding your connections physically into a complex and changing infrastructure, we do it for you. You will have visibility and improved security because you will have access to all data in motion,” explained Khan.
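As a rough illustration of the model Khan describes, the sketch below contrasts every tool wiring itself into every traffic source with a single collection layer that captures once and fans the traffic out to all tools. The class and method names are hypothetical, chosen only to make the idea concrete; they are not Gigamon's actual interfaces.

```python
# Hypothetical sketch of the two models Khan contrasts; names are illustrative,
# not Gigamon's actual API.

class TrafficSource:
    """A place where data in motion can be observed (data center, VM, cloud VPC)."""
    def __init__(self, name):
        self.name = name

    def capture(self):
        # In a real deployment this would be a TAP, SPAN port, or mirroring session.
        return f"packets from {self.name}"


class Tool:
    """Stand-in for a security or monitoring tool."""
    def __init__(self, name):
        self.name = name

    def ingest(self, traffic):
        print(f"{self.name} analyzing {traffic}")


def tools_fetch_individually(tools, sources):
    """Every tool wires itself into every source; any missed pairing is a blind spot."""
    for tool in tools:
        for source in sources:
            tool.ingest(source.capture())


class CentralTap:
    """Single collection layer: capture once, then deliver to every tool."""
    def __init__(self, sources):
        self.sources = sources

    def fan_out(self, tools):
        for source in self.sources:
            traffic = source.capture()     # captured once per source
            for tool in tools:
                tool.ingest(traffic)       # every tool sees the same complete view


if __name__ == "__main__":
    sources = [TrafficSource("on-prem DC"), TrafficSource("AWS VPC"), TrafficSource("Nutanix cluster")]
    tools = [Tool("NDR"), Tool("SIEM")]
    CentralTap(sources).fan_out(tools)     # one tap, many tools; no per-pair wiring
```

In the first model, forgetting to connect one tool to one source leaves a blind spot; in the second, a new source only needs to be added to the tap once.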
Khan also pointed out that whether on-premises, hybrid cloud, or multi-cloud, complexities will arise, especially since each environment has its own way of accessing data. On-premises, it may be through a physical connection; if the business is running on Nutanix, it needs a Nutanix access mechanism. The same goes for the public cloud, where each provider has its own access mechanism.
For Khan, these complexities lead to blind spots which lead to poor security. And Gigamon simplifies the complexity significantly.
So how does deep observability differ from zero trust?
According to Khan, deep observability is highly complementary to zero trust. Zero trust works by identifying and authorizing connections. But even when a connection is encrypted, the traffic still needs to be inspected: threat actors use encrypted traffic to run their command-and-control communications and to carry out lateral movement and the other malicious behaviors that follow.
“So the need for visibility into all the communication that happens is foundational to zero trust. If you look back at the original zero trust document by John Kindervag almost 12 years ago, there was something he called the DAN, the Data Access Network. The concept there was that you have one place that provides access to all of the data, all of the traffic that’s going around. That’s foundational to zero trust. So what we’re doing is making sure that there are no blind spots in that data access, and we call it the deep observability pipeline. There are no blind spots anywhere in the network, including container traffic and unmanaged devices. We eliminate the blind spots because that is foundational to zero trust and very complementary,” said Khan.
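As a minimal sketch of what inspecting data in motion can surface even without decrypting payloads, the example below flags possible lateral movement from flow metadata alone. The field names, ports, and threshold are assumptions made for illustration; they are not Gigamon's detection logic.

```python
# Illustrative only: flag suspicious east-west (lateral) connections from flow
# metadata, without decrypting payloads. Field names and rules are assumptions.

from collections import defaultdict

def flag_lateral_movement(flows, fanout_threshold=10):
    """Return hosts that open encrypted connections to unusually many internal peers."""
    peers = defaultdict(set)
    for flow in flows:
        internal = flow["dst"].startswith("10.")           # crude "internal network" check
        encrypted = flow["dst_port"] in (443, 8443, 22)    # TLS/SSH as a proxy for encryption
        if internal and encrypted:
            peers[flow["src"]].add(flow["dst"])
    return [src for src, dsts in peers.items() if len(dsts) >= fanout_threshold]

# Example: one workstation suddenly talking to many internal servers over TLS
flows = [{"src": "10.0.0.5", "dst": f"10.0.1.{i}", "dst_port": 443} for i in range(12)]
print(flag_lateral_movement(flows))   # -> ['10.0.0.5']
```

The point is that this kind of signal only exists if every segment of the network, including container and unmanaged-device traffic, is feeding the pipeline; any blind spot is exactly where such behavior goes unseen.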
With Gigamon eliminating blind spots, Khan also highlighted that this could be the go-to model for cybersecurity in the years to come. As infrastructure becomes more complicated, every single access to data needs to be inspected, a process that will only get harder, especially since it may also involve access to third-party data across the supply chain.
“This is where a lot of different aspects come together and make it harder, including more distributed applications. Most of our customers might access a third-party supplier database, which in turn requires the attack surface to be exposed outside of their control. This requires multi-tier zero trust, and it’s a proven, effective practice to improve your security posture significantly. It requires access to data that you may not be able to get from container traffic, for example. Our ability to see this extends out to the supply chain network,” said Khan.
The same applies to developers. While developers want the freedom to build and run their applications, Gigamon’s deep observability captures and tracks that activity and movement, ensuring the environment remains secure.
In the second part of Tech Wire Asia’s conversation with Khan, he discusses the cost of adopting deep observability tools as well as the skills required to manage them, especially since even concepts like zero trust face challenges in implementation.