/r/PostgreSQL


The home of the most advanced Open Source database server on the world's largest and most active Front Page of the Internet.

/r/PostgreSQL

53,031 Subscribers

1

Facing issues while configuring a read replica for Postgres

Hey All!

I am very new to Postgres and I was trying to set up Postgres with a read-replica configuration on an Ubuntu machine, using Ansible. I installed Postgres v14, but it throws an error while creating the replication user, and I'm not able to understand why. I'm attaching a screenshot of the error message along with the Ansible code for creating the replication user.

https://preview.redd.it/dsbr12lq5xge1.jpg?width=2940&format=pjpg&auto=webp&s=4ac0381b7daac5d8cb52ba6ebbd1a3505abec33a

Github - https://github.com/Hemendra05/postgres-as-a-service/blob/main/ansible/roles/primary/tasks/main.yml

Code:

- name: Create replication user
  shell: sudo -u postgres psql -c "CREATE USER {{ replication_user }} REPLICATION LOGIN ENCRYPTED PASSWORD '{{ replication_password }}'"

- name: Configure primary for replication
  lineinfile:
    path: /etc/postgresql/{{ postgresql_version }}/main/postgresql.conf
    regexp: "^#?wal_level ="
    line: "wal_level = replica"
  notify: Restart PostgreSQL

- name: Allow replication connections
  lineinfile:
    path: /etc/postgresql/{{ postgresql_version }}/main/pg_hba.conf
    line: "host replication {{ replication_user }} {{ item }} md5"
  loop:
    - "{{ hostvars['replica1']['ansible_host'] }}/32"
    - "{{ hostvars['replica2']['ansible_host'] }}/32"
  notify: Restart PostgreSQL
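
One classic cause of errors with this kind of shell-wrapped psql call is typographic quotes (“ ” / ‘ ’) sneaking in from copy-paste: psql only understands plain ASCII quotes. For reference, the bare statement as psql should receive it, with `replicator` as a hypothetical user name:

```sql
-- plain ASCII quotes throughout; curly quotes pasted from a document break the parser
CREATE USER replicator WITH REPLICATION LOGIN ENCRYPTED PASSWORD 'secret';
```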

1 Comment
2025/02/03
12:39 UTC

1

I recently faced an issue in production where the host crashed due to storage issues, which in turn caused Postgres to crash. On restart we hit the dreaded stale-pidfile syndrome: the postmaster.pid file still existed, and efforts to delete it were futile. The DB runs on Docker.

1 Comment
2025/02/03
11:46 UTC

1

Choose the Best Database Management Tool for PostgreSQL in 2025

In this detailed guide, we will explore the importance of selecting the right database management tool for PostgreSQL, especially in a team environment. The right tool not only provides an intuitive interface for visualizing data, streamlining SQL query writing, and enabling data synchronization, but also facilitates team collaboration. It allows multiple users to work together efficiently, share project progress, generate detailed documentation, and ensure smooth database management, all while maintaining a user-friendly experience for both individual and group workflows.

The Best PostgreSQL Database Management Tools for 2025

DbSchema

  • Stands out for its visual schema design and synchronization, offering an intuitive GUI that is perfect for teams managing PostgreSQL databases.
  • It also offers interactive diagrams and offline design capabilities.

pgAdmin

  • Is specifically designed for PostgreSQL database management, offering a robust set of tools tailored for database administrators and developers.
  • Provides deep integration and advanced features such as server monitoring, backup and restore, and comprehensive user management.

Navicat

  • Is known for its user-friendly interface and powerful administration features, including data synchronization and cloud integration.
  • It’s perfect for seamless database management and secure connections.

DbVisualizer

  • Is a cross-platform tool with a robust feature set, including a powerful SQL editor and performance analysis tools.
  • It supports multiple operating systems and provides extensive database management capabilities.

DBeaver

  • Is an open-source database management tool that supports a wide range of databases, including PostgreSQL.
  • It offers a user-friendly interface, extensibility with plugins, and strong community support.

dbForge

  • Provides advanced tools for PostgreSQL development and administration, featuring a comprehensive SQL editor and robust data comparison and synchronization capabilities.
  • It’s ideal for secure and efficient database management.

Main Topics

  1. PostgreSQL Integration
  2. Download and Installation
  3. Connecting to a PostgreSQL Database
  4. User Interface and Design Overview
  5. Querying and Reporting
  6. Security and Scalability
  7. Overview of Pricing Models
  8. Conclusions: Which Tool is Best for PostgreSQL?

PostgreSQL Integration

Before using any of these tools, you need to have PostgreSQL running. Install it following this link or by using Docker: docker run --name postgres -e POSTGRES_PASSWORD=mysecretpassword -d -p 5432:5432 postgres

Download and Installation

If you need to manage multiple databases alongside PostgreSQL, consider tools like DbSchema, DbVisualizer, and DBeaver, which support a broader range of database systems in a single platform. If you're working exclusively with PostgreSQL, pgAdmin, Navicat, and dbForge are ideal options, as they offer dedicated installation packages specifically for PostgreSQL.

DbSchema - Developed with Java

  • Download the installation package from here, available on Windows, macOS, and Linux.
  • No email is required; offers a 15-day free trial.

pgAdmin - Developed with Python and JavaScript

  • Download the installation package from here, available on Windows, macOS, and Linux.
  • Free and open-source, no trial or email required.

Navicat PostgreSQL - Developed with C++

  • Compatible exclusively with PostgreSQL on Windows, macOS, and Linux. Download it from here.
  • No email is required. Activate the 14-day free trial.

DbVisualizer - Developed with Java

  • Download it from here, available on Windows, macOS, and Linux.
  • An email address is requested to activate the 21-day free trial.

DBeaver - Developed with Java (Eclipse-based platform)

  • Download it from here, available on Windows, macOS, and Linux or with an Eclipse Plugin.
  • An email address is requested to receive the 14-day trial license. A link will be sent to you to set a password, which automatically creates an account on the website. There you will find the license key for application activation.

dbForge for PostgreSQL - Developed with Delphi

  • Compatible exclusively with PostgreSQL on Windows, macOS, and Linux. Download it from here.
  • Supports PostgreSQL versions up to 16. PostgreSQL 17 is not supported.
  • An email address is requested to activate the 30-day free trial.

Connection Steps to a PostgreSQL Database

All of these tools simplify the process of connecting to PostgreSQL. Let me walk you through their Connection Dialogs:

DbSchema - Uses JDBC drivers

  • Choose the PostgreSQL database, select the JDBC URL that is automatically downloaded by DbSchema, then insert the authentication credentials.
  • More Info: DbSchema Documentation

DbSchema connect

pgAdmin - Uses libpq and psycopg2

  • Server Connection: Requires a connection to a PostgreSQL server to work on a database; you can't work offline or design a schema without connecting to a database instance.

pgAdmin connect

Navicat PostgreSQL

  • The connection dialog simplifies input of server details and authentication credentials.

Navicat connect

DbVisualizer - Uses JDBC drivers

  • Choose the PostgreSQL database, then insert the authentication credentials.

DbVisualizer connect

DBeaver - Uses JDBC drivers

  • Choose PostgreSQL from the database list, then enter your authentication credentials to connect.

DBeaver connect

dbForge

  • Create a new connection, enter your credentials, and select the desired database.
  • You can choose the working environment to match your workflow.

dbForge connect

User Interface and Design Overview

DbSchema

Visual Schema Design: Provides an intuitive graphical interface for designing and managing database schemas. You can visually create tables, relationships, and constraints, making it easier to understand and modify the database structure.

Example: Use the visual editor to drag and drop tables and relationships, automatically generating the corresponding SQL scripts. You can also leverage the query builder to construct complex queries without writing SQL manually.

DbSchema interface

pgAdmin

User Interface: pgAdmin’s Dashboard provides an overview of the server's performance, query statistics, and more. It displays data visually in the form of charts and graphs, but this is more focused on server monitoring than schema design.

Schema Design: The Entity-Relationship Diagram (ERD) tool in pgAdmin provides a graphical representation of database tables, columns, and their inter-relationships. While it may not be as intuitive as some other tools and lacks drag-and-drop functionality, it offers a robust set of features for database design and visualization.

pgAdmin interface

Navicat PostgreSQL

Ease of Use: While Navicat is marketed for its user-friendly interface, as a first-time user, it might not feel as intuitive as expected. However, once you familiarize yourself with the application, it does simplify database administration tasks. It features a powerful query builder, data modeling tools, and efficient data transfer capabilities, making it a valuable tool for managing databases.

Example: First, you have to create a New Model Workspace. When working with an existing database, you need to reverse engineer the database into a model, then, click on Existing Model Objects to view the database tables. From there, you can drag and drop the tables onto the diagram for easy visualization.

Navicat interface

DbVisualizer

User Interface: Provides a tab-based user interface that allows you to manage database objects, but it doesn't support drag-and-drop for adding tables to the diagram.

Example: Users can manage database objects through the Database Object Tree and write SQL scripts in the SQL Commander, which offers syntax highlighting and auto-completion.

DbVisualizer interface

DBeaver

User Interface: It offers a visual editor for creating and managing database schemas, including tables, relationships, and constraints.

Example: In the ER diagram, users can view and interact with database tables and their relationships visually.

DBeaver interface

dbForge

User Interface: Offers a clean, customizable interface with easily rearranged toolbars and panels. It features a start page for quick access to recent projects and a capable SQL editor.

Example: Database Diagram tool allows for intuitive drag-and-drop schema design, enabling users to visually create, modify, and manage database structures, where relationships between tables are easily represented.

dbForge interface

Querying and Reporting

Each tool provides a robust querying environment, while reporting capabilities vary in terms of automation and output formats. Tools like Navicat and dbForge offer built-in report builders, while others like DbSchema and DBeaver rely more on query result exports for reporting.

  1. DbSchema Querying:

Offers a built-in SQL editor that supports query building and execution. It includes auto-completion and query validation, which makes querying efficient. The visual query builder allows users to drag and drop tables, simplifying complex query creation without writing SQL manually.

Reporting: Allows exporting query results in formats like CSV or Excel for basic reporting. It also can export database diagrams, relationships, and diagrams as interactive HTML5 documentation, making it stand out from most other tools. This is a key feature used for collaboration and sharing schema designs in an accessible format.

  2. pgAdmin Querying:

Includes a robust Query Tool that supports advanced SQL query execution with features like syntax highlighting, auto-completion, and error detection. It allows users to write, execute, and save SQL queries directly within the interface, making it efficient for database management and development tasks.

Reporting: Does not have a built-in report designer, but it allows users to export query results in various formats such as CSV, Excel, and JSON.

  3. Navicat for PostgreSQL Querying:

This tool includes a powerful SQL Builder that lets users create complex SQL queries without needing to manually write code. It offers syntax highlighting, auto-completion, and the ability to run queries directly in the interface.

Reporting: Has a built-in Report Builder that allows users to design, customize, and generate reports based on their query results. The reporting tool supports exporting to different formats such as HTML, PDF, and Excel, making it highly versatile.

  4. DbVisualizer Querying:

Offers an advanced SQL editor that includes features such as auto-completion, syntax highlighting, and query history. It supports running multiple queries in parallel and provides result grid views for easy data exploration.

Reporting: Allows users to export query results directly to several formats (CSV, JSON, XML, etc.) for further reporting. Although it doesn’t have an in-built report designer, it offers a high degree of customization in query execution and exporting, which can serve reporting purposes.

  5. DBeaver Querying:

Has a highly flexible SQL editor with features like code completion, error highlighting, and multiple result sets. It supports running complex queries and scripts efficiently and allows the user to visualize query plans for performance optimization.

Reporting: Doesn’t have a dedicated reporting tool but allows users to export query results into various formats, including Excel, CSV, and HTML, which can be used for external reporting. Its customizable dashboards and SQL execution history also help with query-based insights.

  6. dbForge Studio for PostgreSQL Querying:

Includes an advanced SQL editor with powerful query-building capabilities. The visual query builder allows users to create SQL queries without writing code, and the editor supports error checking, syntax highlighting, and code auto-completion.

Reporting: Comes with a built-in reporting tool that allows users to create complex reports from their data. The report builder is highly customizable, supports various output formats, and allows scheduling for automated reporting. It’s particularly useful for generating data-driven insights from PostgreSQL databases.

Security and Scalability

It is crucial to understand the scalability and security of database management tools when making your choice, as these factors significantly impact your database's performance, growth potential, and overall data protection.

  1. DbSchema

Security: Provides a robust security framework, allowing for user management with role-based access control. This ensures that sensitive data is only accessible to authorized personnel. Additionally, it supports connection encryption, which protects data in transit, making it a suitable choice for enterprises that prioritize data security.

Scalability: Is designed to handle large-scale databases efficiently, supporting schema synchronization and versioning. This feature is essential for managing database changes as your data grows, particularly for environments with more than 10,000 tables. Its visual design tools also facilitate easier navigation and management of complex database structures, allowing teams to scale their database architecture seamlessly.

  2. pgAdmin

Security: Offers security features that include role-based access control and user authentication to ensure that only authorized users can access and modify data. Additionally, pgAdmin supports SSL encryption for secure data transmission, safeguarding data integrity and confidentiality during transit.

Scalability: Is equipped to handle large-scale PostgreSQL databases efficiently. It provides advanced tools for server monitoring, performance tuning, and automated backups, which are crucial for maintaining database performance and reliability as the database grows.

  3. Navicat for PostgreSQL

Security: Implements security measures such as SSH tunneling and SSL connections, ensuring data is encrypted during transmission. It also features user authentication and access control mechanisms to safeguard sensitive information.

Scalability: Supports scalability through features like data transfer, synchronization, and backup, which help maintain data integrity and performance as databases grow in size and complexity.

  4. DbVisualizer

Security: Emphasizes security with user authentication, permissions, and support for encrypted connections, ensuring that sensitive database interactions are secure.

Scalability: Designed for performance, DbVisualizer manages large datasets efficiently and provides tools to optimize query performance, making it suitable for scaling database applications.

  5. DBeaver

Security: Offers comprehensive security features, including user authentication and encryption for data at rest and in transit, protecting against unauthorized access.

Scalability: With capabilities for handling large databases, DBeaver supports data import/export and schema comparison, facilitating scalability in database management.

  6. dbForge Studio for PostgreSQL

Security: Includes user access management and supports encryption for data, providing a secure environment for database operations.

Scalability: The tool offers robust features for database synchronization and data comparison, helping organizations scale their databases efficiently as their needs grow.

Overview of Pricing Models

When choosing a database management tool, understanding the pricing models is essential. While each of these tools offers a one-time purchase option, it's important to consider the features provided in relation to your organization's specific needs.

DbSchema

Cost: $294 + taxes for a perpetual license. Trial: Offers a 15-day free trial.

pgAdmin

Cost: pgAdmin is an open-source tool and is available for free.

Navicat for PostgreSQL

Cost: $229.99 for a standard license or $1599 for the Premium Edition. Trial: Provides a 14-day free trial.

DbVisualizer

Cost: $229 for a license. Trial: Available with a 21-day free trial.

DBeaver

Cost: $250 for a standard license. Trial: 14-day free trial included.

dbForge Studio for PostgreSQL

Cost: $160 for PostgreSQL or $769.95 for the Edge edition. Trial: 30-day free trial available.

Conclusions: Which Tool is Best for PostgreSQL?

Choosing the right database management tool for PostgreSQL involves evaluating the strengths and weaknesses of each option. By considering factors such as user experience, feature sets, and cost, you can select a tool that best aligns with your organization's needs and enhances your database management efficiency.

Read the original article here

1 Comment
2025/02/03
11:41 UTC

3

Would you use PG as a triple-store?

I'm scoping a pet project with a graphlike dataset that changes over time. I was looking at various graph dbs and triple stores, but I just trust PG more, and I feel like it gives me a lot of extensibility if parts of the data end up being tabular.

I'm thinking something like this:

CREATE TABLE rdf (
     subject INT NOT NULL,   -- Postgres has no INT UNSIGNED; use INT/BIGINT, optionally with a CHECK (subject >= 0)
     object INT NOT NULL,
     predicate TEXT,
     ts TIMESTAMP DEFAULT NOW(),
     UNIQUE (subject, object, predicate)
);

-- create some indices? 

CREATE TABLE nodes (
     node SERIAL PRIMARY KEY,
     ts TIMESTAMP DEFAULT NOW()
);

-- later...
CREATE TABLE node_meta_data (
     node INT PRIMARY KEY,   -- again plain INT; UNSIGNED is not PostgreSQL syntax
     ts TIMESTAMP DEFAULT NOW(),
     something TEXT,
     something_else TEXT,
     how_many INT
);

Questions:

Do I need to add more indices? Queries could be based on any combination of subject, object, and predicate, but I expect the most common will be subject+predicate and object+predicate. Is this the kind of thing I just have to wait and see?
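
For the access patterns described, two composite b-tree indexes are the usual starting point; the UNIQUE constraint already gives an index leading on subject, but it does not directly serve subject+predicate or object+predicate lookups. A sketch (index names are illustrative):

```sql
CREATE INDEX rdf_subject_predicate_idx ON rdf (subject, predicate);
CREATE INDEX rdf_object_predicate_idx  ON rdf (object, predicate);
```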

In theory I can implement graph algos in recursive CTEs... how much will this hurt?
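
As a rough sketch of what the recursive-CTE approach looks like (assuming the rdf table above, with node 1 as an arbitrary starting point):

```sql
-- transitive reachability from node 1; UNION (not UNION ALL)
-- de-duplicates rows, which also terminates cycles
WITH RECURSIVE reachable AS (
    SELECT object AS node FROM rdf WHERE subject = 1
    UNION
    SELECT r.object FROM rdf r
    JOIN reachable ON r.subject = reachable.node
)
SELECT node FROM reachable;
```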

6 Comments
2025/02/03
11:01 UTC

1

Debian package creation

Hi,

does anyone know where the scripts for creating the PostgreSQL .deb packages are located? I want to create Debian packages from the source code and don't want to do it from scratch. Thx

2 Comments
2025/02/03
09:16 UTC

2

PostgREST JWT actions.

Hello, PostgreSQL users and experts. I'm a beginner with PostgREST and want to know about JWT authentication.

As far as I know, I can use the pgjwt extension for JWT authentication, such as signing and verification.

But what I want to know is a little different.

Is it possible to add the user_id from the JWT payload to the data?

For example,

If the client sends the request below:

curl --get 'address/post?regdate=gte.2025.01.01' \
  -H 'Authorization: Bearer <jwt with payload { "user_id": 10, "role": "user" }>'

I want the request above to work the same as the request below:

curl --get 'address/post?regdate=gte.2025.01.01&user_id=eq.10' \
  -H 'Authorization: Bearer <jwt with payload { "user_id": 10, "role": "user" }>'

and

curl --request POST address/post \
  -H 'Authorization: Bearer <jwt with payload { "user_id": 10, "role": "user" }>' \
  -d '{ "title": "Title", "content": "I want to know it...TT" }'

the same as

curl --request POST address/post \
  -H 'Authorization: Bearer <jwt with payload { "user_id": 10, "role": "user" }>' \
  -d '{ "title": "Title", "content": "I want to know it...TT", "user_id": 10 }'

How can I do this?
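
PostgREST makes JWT claims visible to SQL through current_setting('request.jwt.claims', true) (a JSON string; older PostgREST versions exposed individual settings like request.jwt.claim.user_id instead), so both the filtering and the stamping can live in the database rather than in the URL. A sketch, assuming a `post` table with an integer `user_id` column:

```sql
-- read side: row-level security limits rows to the caller's user_id
ALTER TABLE post ENABLE ROW LEVEL SECURITY;
CREATE POLICY post_owner ON post
  USING (user_id = (current_setting('request.jwt.claims', true)::json->>'user_id')::int);

-- write side: a column default stamps user_id from the JWT on INSERT
ALTER TABLE post
  ALTER COLUMN user_id SET DEFAULT
    (current_setting('request.jwt.claims', true)::json->>'user_id')::int;
```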

4 Comments
2025/02/03
05:41 UTC

0

Postgres alternative to MongoDB

Hey everyone!

I come from the MERN stack and I wanted to create a personal app using Postgres. For Mongo I just create a new cluster in Atlas and am able to access it anywhere for free (with limits, of course), but I haven't found a way of doing the same thing with Postgres. I either create a local DB or have to pay for an instance. How do you guys do this?

Thank you!

14 Comments
2025/02/02
15:48 UTC

0

jsonb subscripting - index issue

When updating, column['key_example']['1'] = 'a' and column['key_example'][1] = 'a' are the same thing: if 'key_example' doesn't exist, both create an array and place 'a' at index 1.

How can I make it create an object with the key 'key_example', with the value 'a' instead?

And for the love of god please no jsonb_set- unless it can set that field without turning my entire column to a null value for some ***** reason when the previous key doesn't exist.

Explained visually, I have:

UPDATE table SET column['key_example']['1'] = to_jsonb('a')

I want:

{'key_example': {'1': 'a'}}

Instead, I get:

{'key_example': [null, 'a']}    
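
When the parent key is missing, subscripting has to guess what container to create, and an integer-looking subscript makes it guess "array". Building the nested object explicitly sidesteps the guess; a sketch with hypothetical table and column names (note that || replaces the whole key_example value if it already exists):

```sql
UPDATE my_table
SET my_column = coalesce(my_column, '{}'::jsonb)
    || jsonb_build_object('key_example', jsonb_build_object('1', 'a'));
-- result: {"key_example": {"1": "a"}}
```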
9 Comments
2025/02/01
21:50 UTC

1

Want to know more about how PostgreSQL stores your data and how you can clean up old tuples?

1 Comment
2025/02/01
16:33 UTC

0

Connection refusal

Just installed PostgreSQL v17.2-1 (pgAdmin 4) on an M1 MacBook Air running Sequoia 15.3. Worked upon installation. Rebooted - worked again. Rebooted yet again and got this: “Connection failed: connection to server at “127.0.0.1” port 5432 failed: could not receive data from server: Connection refused.”

  1. how do I resolve this?
  2. why was the connection made the first two times but not the third?

Thank you

2 Comments
2025/02/01
11:45 UTC

54

pgAssistant released

Hi r/PostgreSQL!

I'm excited to share that we just released pgAssistant v1.7.

PGAssistant is an open-source tool designed to help developers gain deeper insights into their PostgreSQL databases and optimize performance efficiently.

It analyzes database behavior, detects schema-related issues, and provides actionable recommendations to resolve them.

One of the goals of PGAssistant is to help developers optimize their database and fix potential issues on their own before needing to seek assistance from a DBA.

🚀 AI-Powered Optimization: PGAssistant leverages AI-driven language models like ChatGPT, Claude, and on-premise solutions such as Ollama to assist developers in refining complex queries and enhancing database efficiency.

🔗 GitHub Repository: PGAssistant

🚀 Easy Deployment with Docker: PGAssistant is Docker-based, making it simple to run. Get started effortlessly using the provided Docker Compose file.

Here are some features :

  • On a slow & complex query, pgAssistant can provide to ChatGPT or another LLM the query, the query plan, and the DDL for the tables involved, and ask it to optimize the query. The LLM will help you by adding missing indexes, rewriting the query, or both;

  • pgAssistant helps to quickly identify the slow queries by ranking them (e.g. "This SQL query accounts for 60% of the total CPU load and 30% of the total I/O load").

  • pgAssistant uses pgTune: PGTune analyzes system specifications (e.g., RAM, CPU, storage type) and the expected database workload, then suggests optimized values for key PostgreSQL parameters and gives you a docker-compose file with all tuned parameters;

  • pgAssistant helps you to find and fix issues on your database : missing indexes on foreign keys, duplicate indexes, wrong data types on foreign keys, missing primary keys ...

I'd love to hear your feedback! If you find PGAssistant useful, feel free to contribute or suggest new features. Let's make PostgreSQL databases easy for dev teams!

8 Comments
2025/02/01
08:09 UTC

0

Issue connecting data to a database: why can't my computer find the path?

This might be a stupid question, but on my Windows machine I downloaded fake data and I'm trying to connect the data and add it to my database. When I use the command \i "F:\MOCK_DATA.sql" after connecting to my database, I still get the error: No such file or directory, even though the file exists. I even made a copy of the file and ran it from the same path (my F drive) as PostgreSQL, and I'm still getting errors. Why can't my computer find the file?
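
One thing worth trying (an assumption about the cause, not a confirmed fix): psql's \i meta-command does not take shell-style double quotes the way the shell does, and backslashes in its argument can be misread, so the quoted backslash path may not name the file you think. Forward slashes without quotes generally behave on Windows:

```sql
-- forward slashes avoid backslash-escape surprises in psql meta-commands
\i F:/MOCK_DATA.sql
```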

6 Comments
2025/02/01
00:47 UTC

16

Seeking Advice on PostgreSQL Database Design for Fintech Application

Hello

We are building a PostgreSQL database for the first time. Our project was previously working on MSSQL, and it’s a financial application. We have many cases that involve joining tables across databases. In MSSQL, accessing different databases is straightforward using linked servers.

Now, with PostgreSQL, we need to consider the best approach from the beginning. Should we:

  1. Create different databases and use the Foreign Data Wrapper (FDW) method to access cross-database tables, or
  2. Create a single database with different schemas?

We are looking for advice and recommendations on the best design practices for our application. Our app handles approximately 500 user subscriptions and is used for fintech purposes.

Correction: sorry, I meant 500K users.

22 Comments
2025/01/31
22:21 UTC

8

Is there an alternative to LIKE, or is LIKE fast enough?

Hello,

I have this statement:

concat(LOWER(firstname) , ' ' , LOWER(lastname)) LIKE '%'||$2||'%' 

Want to ask you, is there a better option than LIKE for this?

And how should my index look when using LIKE with this?
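
For a leading-and-trailing-wildcard LIKE, a plain b-tree index cannot help; the usual answer is a trigram GIN index (pg_trgm) on the same expression the query uses. A sketch, assuming the columns live in a hypothetical `users` table; note that concat() and || treat NULLs differently, and the indexed expression must match the query's expression exactly:

```sql
CREATE EXTENSION IF NOT EXISTS pg_trgm;

CREATE INDEX users_name_trgm_idx ON users
  USING gin ((lower(firstname) || ' ' || lower(lastname)) gin_trgm_ops);

-- queries of this shape can then use the index
SELECT * FROM users
WHERE lower(firstname) || ' ' || lower(lastname) LIKE '%' || $2 || '%';
```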

16 Comments
2025/01/31
14:26 UTC

3

How are AND expressions evaluated?

Hi,

I have discovered a weird case that I do not really understand: I have a table with a json column which contains objects with mixed values. So a property could be a string or number or whatever.

Therefore I use the following query for numbers:

SELECT *
FROM "TestEntity"
WHERE
    ((json_typeof("Json"->'mixed') = 'number' AND ("Json"->>'mixed')::numeric > 5));

but this does not work for booleans. I get the following error: ERROR: invalid input syntax for type boolean: "8"

SELECT *
FROM "TestEntity"
WHERE
    ((json_typeof("Json"->'mixed') = 'boolean' AND ("Json"->>'mixed')::boolean = true));

It seems to me that in the second case the right side is also evaluated. I changed it to:

SELECT  *
FROM "TestEntity"
WHERE
    (CASE WHEN json_typeof("Json"->'mixed') = 'boolean' THEN ("Json"->>'mixed')::boolean = true ELSE FALSE END);

But I don't get it.
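
This is expected: PostgreSQL does not guarantee left-to-right evaluation of AND operands, so the cast may run on rows the type check would have filtered out, and CASE is the documented way to force an evaluation order. If the column were jsonb rather than json, the failing cast could be avoided entirely by comparing values as jsonb (a sketch):

```sql
-- jsonb equality against a literal; no text cast that can fail on non-boolean rows
SELECT *
FROM "TestEntity"
WHERE "Json"->'mixed' = 'true'::jsonb;
```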

7 Comments
2025/01/31
14:14 UTC

0

Problems with PostgreSQL on portainer.

Hello everyone, I'm new to Postgres and Portainer and don't know a lot of things; if anyone could help me I would appreciate it a lot!

I'm having this problem with the Postgres container:

PostgreSQL Database directory appears to contain a database; Skipping initialization

2025-01-31 10:06:02.092 -03 [1] FATAL: could not write lock file "postmaster.pid": No space left on device

Does anyone know how I can fix this without losing data? I'm using Dify and n8n, and I can't remember if I backed up my work.

5 Comments
2025/01/31
13:34 UTC

0

Where is my error? psycopg2 and a variable-filled INSERT statement.

Hello, I am using psycopg2 with Python to insert information into a database. Somehow I am not seeing my mistake after working on this for a while. Data is not being entered into the database.

Below is my code:

conn = psycopg2.connect(
    database="postgres",
    user='netadmin',
    password='*****',
    host='x.x.x.x',
    port='5432'
)

for x in result:
    try:
        cursor = conn.cursor()
        snmpname = x.split()[0].replace('"', '')
        snmpoid = x.split()[1].replace('"', '')
        command = "snmptranslate " + snmpoid + " -Td"
        process = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE,
                                   stderr=subprocess.PIPE, text=True)
        output, errors = process.communicate()
        output = output.split('"')
        mydata = "('" + filename + "','" + snmpname + "','" + snmpoid + "','" + output[1] + "');"
        print(myInsert, mydata)
        cursor.execute(myInsert + mydata)
        conn.commit()
        if conn:  # was `if connection:` -- `connection` is never defined, so this raised NameError
            cursor.close()
    except Exception as e:  # the original bare `except: nothing = 1` silently swallowed every error
        print(e)

This all outputs a string that should be sending, as one example:

INSERT into "public"."mibs-loaded" ("Mib-File", "mib-name", "mib-OID", "mib-description") VALUES ('IF-MIB','zeroDotZero','0.0','A value used for null identifiers.');

I know if I paste that into psql it works no problem.

5 Comments
2025/01/30
21:02 UTC

1

Powering a GraphQL batched paginated subgraph query

I realize my question isn’t really practical, but it’s a curiosity for me at this point.

I have a GraphQL app where each returned type can have paginated sub queries for a few fields. Imagine answering the question, “find me the 5 most recent purchases for each person in this list”.

I whipped up a query that uses a lateral join and it works, but it's slower than I expected when the data set is large (1k people, each with 1k purchases). The default GraphQL behavior of sending a separate query for each person is somehow faster.

Anyone have any tips for this kind of query? Thanks!

2 Comments
2025/01/30
20:27 UTC

1

Preserving replication slots across major Postgres versions - PostgreSQL high availability for major upgrades

Check out this blog (the third in the series), where expert Ahsan Hadi presents yet another new feature in the PostgreSQL 17 release: enhancement to logical replication functionality in PostgreSQL. You will also receive a small script that demonstrates how to use this feature when upgrading from Postgres 17 to a future version. Learn more and read the full blog today! https://hubs.la/Q0347ymY0

1 Comment
2025/01/30
19:08 UTC

6

New to PostgreSQL and want to better understand how transactions work at a low level?

https://stokerpostgresql.blogspot.com/2025/01/a-second-step-into-postgresql.html

I am writing a series on PostgreSQL internals for those seeking a better understanding of what happens at a low level.

1 Comment
2025/01/30
17:27 UTC

1

Issues configuring dbgen package from TPC-H on PostgreSQL

Hi guys! I'm new here. I need to prepare a project for my DB management exam. As the title says, I have some issues configuring makefile.suite for generating data with the dbgen package on my psql client. I have several dumb questions to ask; if anyone could help me, I'll be very grateful.

5 Comments
2025/01/30
15:52 UTC

1

SQL TRIGGERS Explained For Beginners And HOW TO Use Them

1 Comment
2025/01/30
15:39 UTC

2

Help with tuning fulltext search

I'm trying to speed up fulltext search on a large table (many hundred million rows) with pre-generated TSV index. When the users happen to search for keywords with very many appearances, the query becomes very slow (5-10 sec.).

SELECT id FROM products WHERE tsv @@ plainto_tsquery('english', 'the T-bird') LIMIT 100000;

The machine has plenty of memory and CPU cores to spare, but neither increasing work_mem nor max_parallel_workers_per_gather nor decreasing the limit (e.g. to 1000) had any significant effect.

Re-running the query doesn't change the runtime, so I'm pretty confident the data all comes from cache already.

Any hints what to try ?

The one thing I did notice was that plainto_tsquery('english', 'the T-bird') produces 't-bird' & 'bird' instead of just 't-bird', which doubles the runtime for this particular query. How could I fix that without losing the stop word removal and stemming?
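
The splitting can be confirmed directly: the default parser indexes a hyphenated word both as a whole and as its parts, which is where the extra term comes from:

```sql
SELECT plainto_tsquery('english', 'the T-bird');
-- produces 't-bird' & 'bird', as described above
```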

7 Comments
2025/01/30
15:04 UTC

0

How to properly verify an international name column by using a domain with regex?

Hi,

I want to create a domain for my name columns, where I check against a "Unicode character class escape".

An example Regex: https://regex101.com/r/iY7iJ6/2

It seems to be unsupported by PostgreSQL, and I want to know how to implement an alternative solution. Probably a Perl function which supports these regex classes?

I want to support all / most kind of names (accents, special chars...).
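
One avenue worth sketching (an assumption, not a drop-in replacement for the linked regex): PostgreSQL's POSIX bracket expressions such as [[:alpha:]] are locale-aware, so under a UTF-8 locale they match accented and other non-ASCII letters even though \p{L} is unsupported:

```sql
-- letters (locale-aware) plus internal spaces, hyphens, and apostrophes; illustrative only
CREATE DOMAIN person_name AS text
  CHECK (VALUE ~ '^[[:alpha:]]([[:alpha:] ''-]*[[:alpha:]])?$');
```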

Thanks.

11 Comments
2025/01/30
14:15 UTC

0

Issue with pgAdmin 4: How to Truncate Tables Before Restoring a Backup?

Hi everyone,

I'm trying to create a backup of one or more tables in pgAdmin 4 and, when restoring them, I want to enable an option to truncate the tables before restoring the data.

However, I can't find any option to specify this behavior during the backup creation. During the restore process, there is a "Clean before restore" option, which seems like it could do what I need, but it conflicts with the "Only data" option, which is enabled and cannot be disabled.

Of course, I could manually truncate each table before restoring, but I’d prefer an automated solution rather than doing it manually.

Has anyone found a way to achieve this?

Thanks in advance for any help!

3 Comments
2025/01/30
09:44 UTC
