Today’s businesses need to adapt to changing needs and seize new opportunities. Consider Amazon. They started as an online bookseller and then moved into selling other retail items. To support their growing e-commerce business, they built large server farms, which also helped expedite processing through their fulfillment centers. When they realized they had idle computing resources, they decided to offer those resources to other organizations, and a new industry was born. By 2020, AWS accounted for nearly 60% of Amazon’s operating income. Without the ability to spot trends and adapt to change, they might have missed their chance to spawn the cloud computing revolution.
Your business may not be considering such a significant change in direction, but being able to spot trends is an important factor in creating an agile enterprise. Agility means your company can move in a new direction when an opportunity is identified. That may be something as simple as communicating with your customers in a different way, or it may lead to the introduction of a new product or service. In either case, without the ability to quickly spot changing needs, it is difficult to adapt to them.
Cost is also a factor in agility. Spotting changing trends requires running intensive analysis on your data. In the past, performing advanced analytics often meant adding an entirely new database dedicated to analytics: you kept your legacy transaction processing environment and stood up a second environment alongside it. That new environment came with its own costs, on top of the existing infrastructure costs. Companies also routinely overprovisioned their data environments to accommodate activity surges, such as Black Friday for a retailer or a new game launch for a gaming company. Overprovisioning meant that peak activity could be accommodated, but it carried an ongoing cost, and running a second analytics environment added to operating costs as well.
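To make the overprovisioning cost concrete, here is a minimal sketch with purely hypothetical numbers: a cluster sized for a Black Friday peak sits mostly idle the rest of the year, and that idle headroom is capacity you pay for but rarely use.

```python
# Illustrative only: every figure below is hypothetical, not a benchmark.
PEAK_QPS = 50_000                 # capacity needed for the Black Friday surge
AVERAGE_QPS = 8_000               # typical day-to-day load
COST_PER_PROVISIONED_QPS = 0.50   # assumed yearly cost per unit of provisioned capacity

# A statically provisioned environment must be sized for the peak.
provisioned_cost = PEAK_QPS * COST_PER_PROVISIONED_QPS

# An elastic environment can (roughly) track actual demand instead.
elastic_cost = AVERAGE_QPS * COST_PER_PROVISIONED_QPS

utilization = AVERAGE_QPS / PEAK_QPS
print(f"Average utilization of the overprovisioned cluster: {utilization:.0%}")
print(f"Yearly cost sized for peak:   ${provisioned_cost:,.0f}")
print(f"Yearly cost sized for demand: ${elastic_cost:,.0f}")
```

Under these made-up assumptions the peak-sized cluster runs at 16% utilization most of the year; the exact numbers matter less than the shape of the problem.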
Remember the data center?
Not long ago, many companies built, maintained, and managed their own data centers. A bespoke data center carried a lot of costs. Raised and reinforced floors, customized cooling equipment, and more were needed to house all the hardware supporting your transactional and analytical requirements. When new hardware had to be acquired, there was often a lengthy planning, requisitioning, installation, and integration process before it could be added to the environment. In many cases, it was easier to overbuild for your needs so that you could accommodate growth. Physical and digital security were needed to maintain data availability and privacy. And a team of people was tasked with the care and feeding of the servers and the overall environment, on call at any hour of the day.
While all of this was important and necessary to keep your data available to those who needed it (and unavailable to those who did not), the hassles and costs were high. Companies would invest heavily in data environments only to find that growth required major changes, or even a gutting of the current environment to accommodate a new model. This also frustrated users: by the time people grew accustomed to one environment, it was being replaced by another.
Enter the Cloud
Luckily for us, the days when the only option was a massive data center of your own, housing all your servers and applications, have gone by the wayside. Cloud computing, brought to the mainstream by Amazon Web Services (AWS) in the early 2000s, provided on-demand computing and storage capabilities. New environments could be created more easily, and existing environments could be expanded quickly. Of greater benefit to many companies, the costs of building and maintaining a data center and all its supporting infrastructure could be removed from their direct control. Of course, cloud fees were designed to replace those costs rather than eliminate them, but they were viewed as more stable and predictable.
With the success of the cloud, more companies joined the fray. Gartner forecast that public cloud spending would grow by roughly 21% in 2022, reaching $482 billion.
For many companies, moving to the cloud meant a “lift and shift”: they migrated their existing compute and storage from hardware they owned and controlled to similar hardware that the cloud provider owned and controlled. Over time, the concept of cloud-native capabilities emerged. Cloud-native apps are designed specifically to exploit the benefits of the cloud: they are typically more modular and distributed, they flex more easily to accommodate changing needs, and they are usually easier to implement. Contrary to the name, a cloud-native app does not have to run in the cloud; some companies run cloud-native apps in their own managed environments and still reap the benefits of that flexibility.
Distributed computing
New, cloud-native databases offer a distributed model and the ability to scale appropriately to meet ever-changing needs. By linking individual nodes into a cluster, we can adjust the data environment quickly and react to changing data flow requirements. But it is still often the case that organizations rely on a cumbersome Extract, Transform, and Load (ETL) process to get data from their transactional database to an analytical environment.
When an ETL process comes into play, several issues can arise. You need to ensure that the process is running and copying the data correctly. There is an inevitable delay between when data is captured by your transactional database and when it lands in the second environment. There is also the operational cost of building, maintaining, and running a separate environment for analytics. Finally, your users need to know which environment to query for each request: either environment may be able to answer a given query, but one will usually be more efficient than the other, both in the time it takes to complete and in the resources it consumes.
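As a rough illustration of what that duplication looks like in practice, here is a minimal ETL sketch. It uses two SQLite files as hypothetical stand-ins for the transactional and analytical databases; the table names and the hour-long batch window are assumptions for illustration, not a recommendation.

```python
import sqlite3
from datetime import datetime, timedelta

# Hypothetical stand-ins for the two environments; in reality these would be
# separate database servers, each with its own operational cost.
oltp = sqlite3.connect("transactions.db")   # transactional (OLTP) database
olap = sqlite3.connect("analytics.db")      # separate analytical copy

oltp.execute("CREATE TABLE IF NOT EXISTS orders "
             "(id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL, created_at TEXT)")
olap.execute("CREATE TABLE IF NOT EXISTS orders_fact "
             "(id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL, created_at TEXT)")

# Extract: pull only the orders created since the last batch window.
# By the time these rows land in the analytical copy, they are already stale.
since = (datetime.utcnow() - timedelta(hours=1)).isoformat()
rows = oltp.execute(
    "SELECT id, customer_id, amount, created_at FROM orders WHERE created_at >= ?",
    (since,),
).fetchall()

# Transform: reshape rows for the analytical schema (trivial in this sketch).
transformed = [(rid, cust, float(amount), ts) for rid, cust, amount, ts in rows]

# Load: write the batch into the analytical environment.
olap.executemany("INSERT OR REPLACE INTO orders_fact VALUES (?, ?, ?, ?)", transformed)
olap.commit()
```

Every step in this pipeline has to be scheduled, monitored, and paid for, and the freshness of the analytical copy is bounded by how often the batch runs.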
NewSQL – Next-generation databases
To accommodate business growth, many companies are adopting next-generation databases. Initially these were primarily NoSQL products, which brought the attributes of cloud-native, distributed computing environments into the database world. Over time, NewSQL arrived, taking the next-generation database to a broader audience. NewSQL databases deliver the SQL compatibility and ACID compliance that NoSQL databases often lack, together with the scalability and flexibility needed for speedy transaction processing. This lets you build a database that adapts to changing needs while maintaining the data reliability and availability you have come to expect, without radically changing your applications.
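Because NewSQL databases keep standard SQL and ACID transactions, existing application code changes very little. The sketch below assumes a MySQL-compatible NewSQL endpoint; the host, credentials, port, and table are placeholders, and the mysql-connector-python driver is just one possible choice. The same transaction would run unchanged against a single-node MySQL instance.

```python
import mysql.connector  # pip install mysql-connector-python

# Placeholder connection details: point this at any MySQL-compatible
# NewSQL endpoint (for example, a distributed SQL cluster) or plain MySQL.
conn = mysql.connector.connect(
    host="newsql.example.com", port=4000,
    user="app_user", password="secret", database="shop",
)

try:
    conn.start_transaction()
    cur = conn.cursor()
    # Ordinary SQL statements: the database distributes the data and the
    # work across nodes, but the application still sees one ACID transaction.
    cur.execute("UPDATE accounts SET balance = balance - %s WHERE id = %s", (100, 1))
    cur.execute("UPDATE accounts SET balance = balance + %s WHERE id = %s", (100, 2))
    conn.commit()    # atomic, however many nodes hold these rows
except mysql.connector.Error:
    conn.rollback()  # on failure, neither update is applied
    raise
finally:
    conn.close()
```

If your application already speaks a compatible SQL dialect, the main change is the connection string, which is far less disruptive than rewriting queries for a NoSQL API.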
PingCAP recently worked with O’Reilly Media on a report on next-generation databases. It describes the need for a next-generation, NewSQL database and outlines the considerations for such a digital transformation. Finally, it presents an overview of the road map to digital transformation success.
Summary
Moving your business data environment to one that embraces the new digital world, and adapting to changing needs, is a key determinant of success. Companies that look to the future and evolve over time are best positioned for ongoing success. With a distributed, cloud-native database environment, you can contain costs while providing your users with a database that meets their needs, now and in the future.