There are three particularly common sorts of database "work" that have very different performance characteristics.
Online Transactional Processing (OLTP)
The goal of OLTP work is to respond to transaction requests as quickly as possible. Generally, this kind of processing involves validation queries that touch only small amounts of data, such as verifying that the products being purchased are legitimate products, that supplies are in stock, or that the customer has sufficient credit to cover the purchase.
Correct handling of concurrent transactions is also vital, since many customers may be reading and updating the same stock and account data at the same time.
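As a minimal sketch of this pattern (the products table and all names here are invented purely for illustration), the validation and the update are wrapped in one short transaction:

```python
import sqlite3

# Hypothetical schema, invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT, stock INTEGER)")
conn.execute("INSERT INTO products VALUES (1, 'widget', 5)")
conn.commit()

def place_order(conn, product_id, quantity):
    """Validate and record an order inside one short transaction."""
    try:
        # The connection context manager commits on success and
        # rolls back if an exception escapes the block.
        with conn:
            row = conn.execute(
                "SELECT stock FROM products WHERE id = ?", (product_id,)
            ).fetchone()
            if row is None:
                raise ValueError("no such product")
            if row[0] < quantity:
                raise ValueError("insufficient stock")
            conn.execute(
                "UPDATE products SET stock = stock - ? WHERE id = ?",
                (quantity, product_id),
            )
    except ValueError as err:
        return f"rejected: {err}"
    return "accepted"

print(place_order(conn, 1, 2))   # accepted
print(place_order(conn, 1, 99))  # rejected: insufficient stock
```

Each transaction touches only a handful of rows and either commits quickly or rolls back cleanly.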
Transactional Reporting
Commerce systems generally require detailed reports on transactions, used to validate the correctness of accounting records.
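As a rough illustration (the orders table here is invented for the example), such a report is typically a detailed listing plus a control total that can be checked against the ledger:

```python
import sqlite3

# Invented schema for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, order_date TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "2004-03-01", 19.99),
                  (2, "2004-03-01", 5.00),
                  (3, "2004-03-02", 42.50)])

# Detailed per-transaction listing for one day...
for row in conn.execute(
        "SELECT id, amount FROM orders WHERE order_date = ? ORDER BY id",
        ("2004-03-01",)):
    print(row)

# ...plus a control total to reconcile against the accounting records.
total = conn.execute(
    "SELECT SUM(amount) FROM orders WHERE order_date = ?",
    ("2004-03-01",)).fetchone()[0]
print("control total:", total)
```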
Analytical Reporting
Also known as "data warehousing", this involves running queries that search for patterns in sales data, typically to find targets for improving operations.
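The flavor of such a query, against an invented sales table, might look like the following; the distinguishing feature is that it scans the whole history rather than a few rows:

```python
import sqlite3

# Invented schema for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, sale_month TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("north", "2003-01", 100.0), ("north", "2003-02", 180.0),
    ("south", "2003-01", 90.0),  ("south", "2003-02", 60.0),
])

# A typical warehouse-style query: aggregate over the entire history
# to expose a trend (which regions are growing, which are shrinking).
query = """
    SELECT region, sale_month, SUM(amount) AS total
    FROM sales
    GROUP BY region, sale_month
    ORDER BY region, sale_month
"""
for row in conn.execute(query):
    print(row)
```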
The patterns of data access in these three kinds of database use are very different, and they conflict to a considerable degree. Notably, "data warehouse" applications involve many long-running queries that slow down transaction processing. In a system that is supporting customer activity, you cannot afford to have today's sales injured by analysis work that might, at best, improve things later, or that, more likely, is doing administrative analysis of past history.
Moreover, " data warehouse" queries have a very different handling of parallelism than online systems. Online systems may have many updates taking place simultaneously, and management of that must focus on ensuring that those updates are handled consistently and quickly. In contrast, systems used for reporting are likely to be looking at past historical data, and are not likely to be updating data to any significant degree.
Ideally, these sorts of database usage should be split across separate database instances, so that "transactional" processing takes place on an instance configured for that purpose, while "analytical" processing takes place on systems configured for that purpose.
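In application code, that separation can be as simple as routing each kind of work to a different connection; the DSN strings below are placeholders, not real hosts:

```python
# Placeholder DSNs; in practice each points at an instance
# configured and tuned for its particular workload.
DSNS = {
    "oltp": "host=oltp-db dbname=shop",       # many small, fast transactions
    "analytics": "host=warehouse dbname=dw",  # long scans, big sorts
}

def dsn_for(workload: str) -> str:
    """Pick the database instance appropriate to the kind of work."""
    return DSNS[workload]

print(dsn_for("oltp"))
print(dsn_for("analytics"))
```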
One thing that you'll find you need is some sort of strategy for purging old data from the "transactional" system; a sketch of such a purge appears after the list below. Otherwise, numerous problems arise:
General database performance will suffer somewhat, as there will be a lot of data sitting in tables that isn't being used.
If the quantities of "things" being tracked increase (e.g. you gain customers, and add products or services), then the amount of "dusty old data" will increase at a greater-than-linear rate.
Backups will take increasing amounts of time, and may require extra hardware to support them.
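Here is the purge sketch promised above. The transactions and transactions_archive tables are invented for the example; a real purge would also run in bounded batches, during quiet hours:

```python
import sqlite3

# Invented schema: a live table and an archive table of the same shape.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (id INTEGER, tx_date TEXT, amount REAL)")
conn.execute("CREATE TABLE transactions_archive (id INTEGER, tx_date TEXT, amount REAL)")
conn.executemany("INSERT INTO transactions VALUES (?, ?, ?)",
                 [(1, "2001-05-01", 10.0), (2, "2003-06-01", 20.0)])
conn.commit()

def purge_before(conn, cutoff_date):
    """Copy rows older than the cutoff to the archive, then delete them,
    in one transaction so a failure leaves both tables consistent."""
    with conn:
        conn.execute(
            "INSERT INTO transactions_archive "
            "SELECT * FROM transactions WHERE tx_date < ?",
            (cutoff_date,))
        conn.execute("DELETE FROM transactions WHERE tx_date < ?",
                     (cutoff_date,))

purge_before(conn, "2003-01-01")
print(conn.execute("SELECT COUNT(*) FROM transactions").fetchone())          # (1,)
print(conn.execute("SELECT COUNT(*) FROM transactions_archive").fetchone())  # (1,)
```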
I have been in the situation where backups had to be submitted to a third-party "escrow" service; because data wasn't being purged, steadily increasing amounts of data were resubmitted each day, which involved:
Encrypting the data (which is CPU-intensive);
Making several copies in the process of moving it from a database server to one with public network access;
Network traffic, moving the same gigabyte of old data across the Internet each day.
The escrow service ran similar processes on their systems, decrypting and validating the data in preparation for their archival activities.
The irritating part was that we knew most of the data, being from the previous year, had already been sent out, over and over, every day. The resources being wasted are ones that get cheaper all the time, but they are hardly free.
Disaster Recovery
As the amount of "old dusty data" being maintained grows, recovering from a disaster takes longer and longer. It is not good if you have to report to corporate management that:
It's taking us 29 hours to recover the databases, of which 3 hours was required to recover the product and inventory tables, and it then took 26 hours to recover transaction logs for the last 3 years.
Cutting that to 5 hours would be a manifestly Good Thing.
General need for additional hardware
Maintaining more data in the database requires more disk space and more memory; in general, more hardware.
You may find that some programs rummage through all of the old data and take longer and longer to run.
I once encountered a situation where an online system suffered fairly severe performance problems because a "message queue" table was being maintained without any kind of purge policy.
Some customers never checked their messages, and thus never deleted them, so their queues steadily filled with increasingly stale messages. One customer was "polling" the system with message-queue queries to keep connections alive, and those queries ran ever more slowly as the table grew.
A script was written to delete all messages more than two weeks old; the table shrank to a fraction of its previous size, the performance issue went away, and memory and disk usage fell substantially.
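The actual script isn't shown here, but a minimal reconstruction along those lines (with an invented messages table) might look like:

```python
import sqlite3
from datetime import date, timedelta

# Invented schema approximating the message-queue table described above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE messages (id INTEGER, customer TEXT, created TEXT)")
conn.executemany("INSERT INTO messages VALUES (?, ?, ?)", [
    (1, "acme", "2004-01-01"),              # long stale
    (2, "acme", date.today().isoformat()),  # fresh
])
conn.commit()

# Delete everything more than two weeks old.
cutoff = (date.today() - timedelta(days=14)).isoformat()
with conn:
    deleted = conn.execute(
        "DELETE FROM messages WHERE created < ?", (cutoff,)).rowcount
print("deleted:", deleted)  # deleted: 1
```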