Merging Time Series Data in Different Scenarios with PostgreSQL

By Digoal

Data merging is required in many scenarios. For example, when a table's change details (inserts, updates, and deletes) are recorded, we need to merge those details and quickly obtain the latest value for each primary key (PK).

Another example: in some situations, we need to obtain the latest status of each of many sensors that are constantly uploading data.

Window queries can meet these requirements. However, how can we accelerate them and quickly fetch the results in batches?

To achieve this, we can make some optimizations.

Sensor Case Example

Take the previously mentioned sensor case as an example. Let’s assume that these sensors are constantly uploading data and users need to query the latest value uploaded by each sensor.

Create the following test table.
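The original code sample is not included here, so the following is a minimal sketch. The table name, column names, and data volume (sensor_data, sid, 1 million rows across 100 sensors) are all assumptions for illustration.

```sql
-- Hypothetical test table: each row is one reading uploaded by a sensor.
create table sensor_data (
  id serial8 primary key,  -- auto-increment; reflects upload order
  sid int,                 -- sensor ID
  value float8,            -- uploaded reading
  crt_time timestamp       -- upload time
);

-- Populate with random test data: ~100 sensors, 1 million readings.
insert into sensor_data (sid, value, crt_time)
select random() * 100, random() * 1000, clock_timestamp()
from generate_series(1, 1000000);
```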

The following is the query statement:
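A sketch of the window query, assuming the sensor_data table above: number each sensor's rows from newest to oldest, then keep only row 1 per sensor.

```sql
-- Latest record per sensor via a window function.
select *
from (
  select *,
         row_number() over (partition by sid order by id desc) as rn
  from sensor_data
) t
where rn = 1;
```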

To optimize this, add a composite index so that the sort can be avoided. Note that the id column must be indexed in descending order (desc).
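A sketch of such an index on the hypothetical sensor_data table: with (sid, id desc), each sensor's rows are grouped together in the index and already ordered newest-first, so the planner needs no explicit Sort node.

```sql
-- Composite index matching PARTITION BY sid ORDER BY id DESC.
create index idx_sensor_data_sid_id on sensor_data (sid, id desc);
```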

SQL performance after optimization
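One way to check the effect, assuming the table and index sketched above: the plan for the window query should now show an index scan with no separate Sort node.

```sql
-- After creating the (sid, id desc) index, the window query can stream
-- rows straight from the index; EXPLAIN should show no Sort node.
explain
select *
from (
  select *,
         row_number() over (partition by sid order by id desc) as rn
  from sensor_data
) t
where rn = 1;
```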

If the extracted data requires further processing, we can use a cursor to fetch it in batches. Since no explicit sort is needed, each batch is returned very quickly, accelerating the overall data processing.
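A sketch of batched fetching with a cursor, again assuming the hypothetical sensor_data table and index: because the plan contains no Sort node, the first FETCH returns almost immediately.

```sql
begin;

declare cur cursor for
  select *
  from (
    select *,
           row_number() over (partition by sid order by id desc) as rn
    from sensor_data
  ) t
  where rn = 1;

fetch 100 from cur;   -- first batch: latest values of 100 sensors
-- ... process the batch, then FETCH again for the next one ...

close cur;
commit;
```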

Before optimization, the plan contains an explicit Sort node, so a cursor cannot improve performance: the first record can only be returned after the entire sort has completed.

Example of Incremental Synchronization of Merged Data

When a materialized view is refreshed incrementally (as in Oracle), repeated updates to the same record only require applying the last update; the intermediate states of each update can be skipped.

We can use a similar method: group the change records and take only the last record of each group.
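A sketch of this merge, using a hypothetical change-log table (all names and the op encoding are assumptions): every insert, update, and delete on the source table appends a row here, and synchronization only needs the final state of each PK.

```sql
-- Hypothetical change log of a source table.
create table tbl_change_log (
  id serial8 primary key,               -- log sequence number
  pk int,                               -- PK of the changed source row
  op text,                              -- 'I', 'U', or 'D'
  row_data jsonb,                       -- row image after the change
  crt_time timestamp default now()
);

create index idx_change_log_pk_id on tbl_change_log (pk, id desc);

-- Merge: keep only the last change per PK; intermediate updates are
-- skipped entirely. Rows whose final op is 'D' are deletions downstream.
select distinct on (pk) *
from tbl_change_log
order by pk, id desc;
```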

Excellent Optimization Method for Sparse Columns

As we can see, the preceding optimization method only eliminates the sort; it does not reduce the number of blocks that must be scanned.

When there are very few groups (that is, the grouping column is sparse), we can use an even more powerful optimization method: recursive queries.
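A sketch of the recursive "skip scan" pattern on the hypothetical sensor_data table: each recursive step jumps from the current sensor ID to the next one via the (sid, id desc) index, so only a handful of index blocks are read per sensor instead of scanning the whole table.

```sql
with recursive skip as (
  (
    -- Anchor: the newest row of the smallest sensor ID.
    select s as r
    from sensor_data s
    order by sid, id desc
    limit 1
  )
  union all
  (
    -- Step: jump to the newest row of the next-larger sensor ID.
    select (
      select s
      from sensor_data s
      where s.sid > (skip.r).sid
      order by s.sid, s.id desc
      limit 1
    ) as r
    from skip
    where (skip.r).sid is not null   -- stop when no sensor remains
  )
)
select (r).* from skip where (r).sid is not null;
```

Each row of the recursion carries a whole-row composite value (r), which is expanded back into columns at the end; this is what lets a single index probe fetch the entire latest record per sensor.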


Fast FETCH through a cursor is still supported with the recursive form, since each recursive step touches only a few index blocks.

This recursive optimization delivers roughly tenfold better performance, allowing 10 million records to be filtered in just 4 seconds.


Follow me to keep abreast of the latest technology news, industry insights, and developer trends.