2020’s sudden shift to remote work presented organisations with a major challenge: protecting a skyrocketing volume of data from new vulnerabilities. With many countries beginning to embrace a ‘new normal’, organisations face a fresh challenge: protecting the data of a hybrid workforce, as well as that of their stakeholders. The answer lies deeper than database security alone; it requires securing the entire data centre.
In the past, the focus was on code vulnerabilities and access control. In an era of microservices, distributed systems and online data processing, and with hybrid work now the ‘new normal’, the focus is shifting to securing communications channels. Expect an increased emphasis on inter-process security and more secure communications between applications. We must be able to trust information from its source, ensure it reaches only the right people, and confirm that the data is properly authenticated and authorised along the way.
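One common way to secure channels between applications is mutual TLS, where each side both verifies the other’s certificate and presents its own. The sketch below, using Python’s standard-library `ssl` module, shows the idea; the function name and certificate file paths are illustrative, not taken from any particular product discussed here.

```python
import ssl

def make_mutual_tls_context(ca_file=None, cert_file=None, key_file=None):
    """Build a client-side TLS context that verifies the server and,
    if a certificate is supplied, authenticates the client to it too."""
    # SERVER_AUTH means: we are the client and will verify the server.
    context = ssl.create_default_context(ssl.Purpose.SERVER_AUTH, cafile=ca_file)
    # Refuse any peer that does not present a certificate we trust.
    context.verify_mode = ssl.CERT_REQUIRED
    context.check_hostname = True
    # Rule out legacy protocol versions.
    context.minimum_version = ssl.TLSVersion.TLSv1_2
    # Present our own certificate so the server can authenticate us as well.
    if cert_file:
        context.load_cert_chain(certfile=cert_file, keyfile=key_file)
    return context
```

A context like this would then be passed to whatever client library opens the connection (for example, via `context.wrap_socket`), so that every inter-service channel is both encrypted and mutually authenticated.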
The challenge here lies in the fact that many organisations rely on databases built over 30 years ago. In closed source settings, developing a new version of these databases most often means the existing versions must also continue working, leaving the product with mounting technical debt. This can leave data exposed, creating fertile ground for bugs and security flaws. If 30-year-old technology doesn’t evolve, you end up with a clunky, muddled system.
This is where open source database technologies like Postgres succeed. The community is passionate about building robust software, and has no fear of changing things that no longer work as well as they once did.
While misconceptions remain about open source technology, the challenges of closed source databases must also be considered. For instance, customers are often required to spend more maintaining these databases than they would on an open source alternative. The drive of open source is to ensure organisations are not held back paying for features that won’t repay their cost in value. Open source also lets them secure and tailor each cluster to its specific purpose, rather than opening up the entire database to unnecessary vulnerabilities.
A closed database is perceived to be more secure because it’s backed by a single vendor that has a strong focus on security. If you want to exploit a closed source database, you may struggle to find a vulnerability — and the vendor is often working hard to patch any holes and maintain its security.
However, if anyone can expose open source vulnerabilities then, by the same token, anyone can identify and fix them, allowing preventative measures to be built more quickly and securely without relying on a single party to safeguard the code. With more people reading the source code, more people are able to contribute patches and fixes, making the core code safer.
While it can be argued that open source databases lack a dedicated vendor, commercial partners that provide enterprise-ready software and services can bridge the gap. They support the open source community by helping implement safeguards and closing any gaps that remain, ensuring patches are released early and often, while providing a 24/7 service that helps organisations remain ‘always on’ and protect themselves and their stakeholders.
As the transition to wide-scale hybrid working begins, we’re not yet where we need to be: much remains to be done to properly secure data in the ‘new normal’ environment. However, organisations can best protect themselves and their stakeholders by investing in open source databases, ensuring peace of mind at a fair cost.