/r/mongodb
News, articles, and interesting stuff in general about MongoDB (unofficial).
Just built a tool to create MongoDB backups to S3 seamlessly. Shared all the details in my first Medium post.
Would love your feedback!
I am connecting to this database from my computer; the database is hosted on a Linux VPS which I bought from Hostinger. I don't know how that file got there, nor do I know why it's there, and I've noticed that some schemas get erased completely in my database. Anyone have any idea what this is?
Hey Folks,
I currently run an M30 with about 18M documents on Atlas and make use of Atlas Search (mainly text search for now, but potentially vector search as well down the road).
Atlas is great, but the bill is not. It's expensive. I've thought about self-hosting or switching to a lower-cost managed provider, but the thing holding me back is text search.
For those who have encountered a similar scenario, how have you handled it?
Any advice is appreciated.
Thanks!
Hi! We recently released FerretDB 2.0 Release Candidate – a truly Open Source (Apache License 2.0) MongoDB alternative built on PostgreSQL. Version 2 is powered by the DocumentDB PostgreSQL extension by Microsoft – the same technology that powers Cosmos DB for MongoDB (vCore). On top of that, we provide some additional features. For example, v2 provides Data API – the compatible replacement for the deprecated Atlas Data API (a.k.a. "Fuck you MongoDB")
Read our announcements here: FerretDB, Microsoft. And please star us on GitHub: https://github.com/FerretDB/FerretDB
Hi everyone
I deployed MongoDB on my Dokploy server, but when I try to connect, it immediately disconnects and MongoDB Compass displays the following error: connect ECONNREFUSED 127.0.0.1:27017, connect ECONNREFUSED ::1:27017
I've attached the container logs in the screenshot.
Additionally, I have included the database configuration screenshot from the deployment process. (This Compass behavior occurs regardless of the replica set status.)
After setting it up, I only specify the External Port (Internet) with the value 27027 and try to connect using the connection string.
My connection string: mongodb://*login*:*pass*@*remote-ip*:27027
To rule out the possibility that the problem is on my PC, I tried connecting from another PC and even from a freshly created virtual machine, but Compass still gives the error connect ECONNREFUSED 127.0.0.1:27017, connect ECONNREFUSED ::1:27017
Could you please help me understand what the problem might be?
When returning data from my TypeScript Node.js app using MongoDB, should I keep _id as it is, or map it to id in a DTO? Is it better to handle this in the Mongoose schema using a virtual field? What's the best practice?
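For reference, the schema-level option I'm considering looks roughly like this (a sketch; the schema itself is made up):

import { Schema, model } from "mongoose";

// Hypothetical schema just to illustrate the _id -> id mapping
const userSchema = new Schema({ email: String });

// Mongoose already adds an `id` virtual (the hex string of _id); this transform
// additionally hides _id, and versionKey: false drops __v, so JSON output only has `id`
userSchema.set("toJSON", {
  virtuals: true,
  versionKey: false,
  transform: (_doc, ret) => {
    delete ret._id;
    return ret;
  },
});

export const User = model("User", userSchema);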
Hey guys!
I am working on a multimodal RAG for complex PDFs (using a PDF RAG chain), but I am facing an issue.
I recently implemented prompt caching in the RAG system using LangChain's MongoDBCache. The way I thought it should work is that when I ask a query, the query and the response are stored in the cache, and when I ask the same query again, the response is fetched from the cache instead of making an LLM call.
The problem is that the prompts are getting stored in the MongoDBCache, but when I ask the same query again, it is not getting fetched from the cache.
When I tried this in a Google Colab notebook with a plain llm.invoke, it was working, but it is not working in my RAG system. Is anyone familiar with this issue? Please help.
mongo_cache = MongoDBCache(
    connection_string="mongodb conn str",
    database_name="new",
    collection_name="prompt_cache",
)

# Set the LLM cache
set_llm_cache(mongo_cache)
Hello, I need some help. I'm a noob in Azure. I am getting a deployment failed error for MongoDB Atlas on Azure: "The resource type could not be found in the namespace 'MongoDB.Atlas' for api version '2024-11-18-preview'."
Microsoft.Template-20250130165019
I will be grateful for any help with this. Thank you!
Does anyone know why I'm getting this error and how to fix it?
Since yesterday, every time I've tried to log in I get "A server error occurred. Please try again in a minute." I've seen nothing about it online. We got notifications that our database is running out of storage, so not being allowed in is currently really bad. We've tried it on multiple machines and OSes in different locations to see if anything changes, but to no avail.
While trying to login, the network tab gives us this:
{ "errorCode": "INVALID_CAPTCHA", "message": "INVALID_CAPTCHA", "params": [], "version": "1", "status": "ERROR" }
We don't have visible captchas on the login page, so I'm not sure how we're supposed to provide a "valid" captcha.
Hey everyone, I've been working on a project using MongoDB in VS Code, and it was running fine for the past few days. However, today when I tried to connect to the server, I started getting this error:
"Unable to connect: connect ECONNREFUSED 127.0.0.1:27017, connect ECONNREFUSED ::1:27017"
Has anyone encountered this issue before? Any suggestions on how to fix it?
Hi guys,
I have a problem with the MongoDB connection to Power BI. I downloaded the BI connector and the ODBC driver. Let's start from the beginning: I set up a local instance of MongoDB with everything needed to connect to Power BI, and it worked. I created a DSN through ODBC and connected to localhost. Now I'm trying to do the same, but on a server. I'm at the point where I have a working instance of MongoDB on the server, and I can connect to it from my workstation by giving Compass the IP of my server and the port number. In mongod.cfg I set bindIp to my server IP (or should I change it to my workstation IP? I don't know if it matters, but the connection still works, so...). So I guess the connection is working. But here the fun part starts: when I set up the ODBC config by giving it the database name, IP address and ports, it tries to connect and crashes. I don't have any user created on the MongoDB instance, because it wasn't needed for the localhost instance. I'll be happy for any help I can get from you guys. Thanks for any help!
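For reference, the part of mongod.cfg I'm talking about is the net block, roughly like this (the IPs are placeholders):

# mongod.cfg (sketch) - bindIp lists the addresses mongod listens on
# (interfaces on the server itself), not the client machines that connect
net:
  port: 27017
  bindIp: 127.0.0.1,<server-ip>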
I am working on an Express backend app with Mongoose as the ODM. I just want to know whether there is any built-in way to import the error codes/types, so that we can compare against them with something like the code snippet below:

import { WriteConflict } from 'mongodb'

if (errorCode === WriteConflict) {
  // do something
}

instead of using a magic number (error code):

if (errorCode === 112) {
  // do something
}
or creating a new file in our codebase manually and using that across applications, like below (not preferred if there is any way to import directly from the MongoDB library):
e.g. a mongodb.error.ts file:
export const MONGODB_ERROR_CODES = {
  HostUnreachable: 6,
  HostNotFound: 7,
  AuthenticationFailed: 18,
  NetworkTimeout: 89,
  ShutdownInProgress: 91,
  PrimarySteppedDown: 189,
  ExceededTimeLimit: 262,
  SocketException: 9001,
  ...
  ...
} as const satisfies Record<string, number>;
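It would then get used roughly like this (a sketch; the helper and the retry are just for illustration, MongoServerError comes from the driver):

import { Collection, Document, MongoServerError } from 'mongodb';
import { MONGODB_ERROR_CODES } from './mongodb.error';

// Hypothetical helper just to show comparing against the manual map
async function insertWithRetry(collection: Collection, doc: Document): Promise<void> {
  try {
    await collection.insertOne(doc);
  } catch (err) {
    // MongoServerError carries the numeric server code on err.code
    if (err instanceof MongoServerError && err.code === MONGODB_ERROR_CODES.PrimarySteppedDown) {
      await collection.insertOne(doc); // retry once against the new primary
      return;
    }
    throw err;
  }
}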
I have checked the MongoDB native Node.js driver source code and found that the error codes are not exported from its index.ts file; see the links below:
Other Related links:
https://www.mongodb.com/docs/manual/reference/error-codes/
P.S: MongoDB Community Forum Post
Thanks
I'm currently diving into backend development and exploring the MERN stack. I want to get a solid grasp of MongoDB, but there are so many courses out there that it's hard to choose the right one, because most of them are outdated. Would you recommend any courses, tutorials, or resources (paid or free) that helped you master MongoDB?
The MongoDB developer site just posted a new tutorial about Temporal. It describes the role of Temporal in a microservices-based system, explains the basic architecture of Temporal, and then walks through the code for an example Temporal application in Java.
The tutorial assumes no prior knowledge of Temporal and should take less than an hour to complete. During the tutorial, you'll see firsthand how Temporal enables an application to automatically recover from a service outage and even a crash of the application itself, continuing on as if it never even happened. You'll also see how to use Signals to implement a human-in-the-loop use case that awaits manual approval before continuing with automated steps.
We've got a couple of apps reliant on a set of MongoDB collections, and are looking to jump over to "Amazon DocumentDB (with MongoDB compatibility)", and migrate from the (soon to be turned off) Atlas data API.
Has anybody made the leap? Any gotchas?
We make extensive use of a subset of the Atlas Data API calls from embedded devices. Does it make more sense to use RESTHeart, or to write our own lambdas to emulate just the Atlas Data API calls the embedded client relies on?
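If we went the lambda route, I imagine each emulated call would end up looking something like this (an untested sketch; the body shape is simplified to { database, collection, filter } and API-key auth is left out):

import { MongoClient } from 'mongodb';
import type { APIGatewayProxyEvent, APIGatewayProxyResult } from 'aws-lambda';

// One client per Lambda container, reused across warm invocations
const client = new MongoClient(process.env.MONGODB_URI as string);

// Rough stand-in for the Data API's findOne action
export const handler = async (event: APIGatewayProxyEvent): Promise<APIGatewayProxyResult> => {
  await client.connect(); // no-op if already connected
  const { database, collection, filter } = JSON.parse(event.body ?? '{}');
  const document = await client.db(database).collection(collection).findOne(filter ?? {});
  return { statusCode: 200, body: JSON.stringify({ document }) };
};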
I spent the whole day trying to figure out how I can convert my standalone mongod to a replica set, and I have failed. This is how I'm running it without replication.
docker-compose.yaml
volumes:
  mongo.data:

services:
  mongo:
    image: ${COMPOSE_PROJECT_NAME}/mongo
    container_name: ${COMPOSE_PROJECT_NAME}.mongo.docker
    restart: unless-stopped
    build:
      context: .
      dockerfile: ./mongo.Dockerfile
    env_file: envs/mongo.env
    volumes:
      - mongo.data:/data/db
    ports:
      - 127.0.0.1:${MONGO_PORT}:${MONGO_PORT}
    networks:
      - network

networks:
  network:
    driver: bridge
mongo.Dockerfile
FROM mongo:6.0
ENV MONGO_PORT=27017
EXPOSE $MONGO_PORT
HEALTHCHECK CMD echo 'db.runCommand("ping").ok' | mongosh 127.0.0.1:$MONGO_PORT/test --quiet || exit 1
ENTRYPOINT docker-entrypoint.sh mongod --port $MONGO_PORT
And then I'm connecting with a URL that is built like this: mongodb://${env('MONGO_USERNAME')}:${env('MONGO_PASSWORD')}@${env('MONGO_HOST')}:${env('MONGO_PORT')}/${env('MONGO_DATABASE')}?authSource=admin
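From what I understand, the change would be roughly this (an untested sketch on my side: add --replSet, then run rs.initiate() once; "rs0" is just a name I picked, and since access control is enabled a keyFile is, as far as I can tell, also needed for internal authentication):

# mongo.Dockerfile (sketch) - run mongod as a single-node replica set
ENTRYPOINT docker-entrypoint.sh mongod --port $MONGO_PORT --replSet rs0

# one-time initialisation from inside the running container, e.g.:
# docker exec -it <container> mongosh -u <user> -p <pass> --port 27017 --eval "rs.initiate()"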
I would appreciate it if someone could help me with that.
Hi,
My Mongo Ops Manager is not loading in the browser and I get a 504 error. I also see that my process has exited, and I don't see any errors in the logs. Support has asked me to run db.getSiblingDB("cloudconf").config.appState.find({}).pretty() in the AppDB. Where do I run this query? I am new to MongoDB.
● mongodb-mms.service - MongoDB Ops Manager
Loaded: loaded (/lib/systemd/system/mongodb-mms.service; enabled; vendor preset: enabled)
Active: active (exited) since Sat 2025-01-25 03:49:04 UTC; 19h ago
Hey everyone!
I’ve been thinking of building a simple iOS widget app for anyone using MongoDB. The idea is straightforward:
This could be useful for developers, startup founders, or anyone who wants to track their user base at a glance without needing to log in to a dashboard.
Some key features I’m considering:
Would this be something you’d use? If not, what would make it more appealing? I’d love to hear your feedback or any additional feature ideas!
Thanks in advance! 🙌
I'm wondering how people showcase their backend projects without adding any frontend to them.
I was looking into the new Flex plans. I understand a lot about it except one part. Is this price per cluster or organization?
I have got 5 clusters, 3 of which I barely use. Does that mean I now suddenly have to pay $40?
I don't mind paying the amount for my whole organization and they can charge me more based on usage, but a $40 base suddenly jumping from $0 seems crazy.
Hi there!
Just recently, I've ventured into the world of databases to collect some data, and I've run into some optimization issues. I'm wondering what would be the best course of action, so I'd like to ask here!
Here's the overview (hopefully your eyes won't bleed):
I have multiple programs collecting this kind of data:
Category: C (around 70 unique categories can occur)
From: A
To: B
(around 4000 different items can occur as both B and A)
My database is set up like this at the moment: I have one collection, and in this collection, documents are labelled with their corresponding category. So if I collect category C, I find the document with that label.
The documents then have attributes and sub-attributes. When updating, I first look for the document with the correct category, then for the correct attribute (A), and then for the sub-attribute (B) of that attribute, and I update its value (the number goes up).
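To make that concrete, each update is essentially doing something like this (a sketch using the Node.js driver just as an illustration; the URI, database, collection and field names are placeholders):

import { MongoClient } from "mongodb";

// Sketch of a single increment for one (category, from, to) observation
async function recordOccurrence(category: string, from: string, to: string) {
  const client = new MongoClient("mongodb://localhost:27017");
  await client.connect();
  const counts = client.db("stats").collection("counts");
  await counts.updateOne(
    { category },                        // find the per-category document
    { $inc: { [`${from}.${to}`]: 1 } },  // bump the nested From -> To counter
    { upsert: true }
  );
  await client.close();
}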
This is, however, terribly slow after it has run for some time. It can only process about 15 updates per second, which I'm really sad about. I don't fundamentally understand how MongoDB works, so I'm having great trouble optimizing it, since I can only do it by trial and error.
That begs the question: How can I optimize this? I am confident there is a better way, and I'm sure some of you experienced guys can suggest something!
Thanks!
MongoDB kept returning all kinds of 8000 errors when I got to 512 MB/512 MB. So I deleted a bunch of documents and things started working again.
The same thing is happening again, even though I believe I'm at 330 MB AND I just upgraded to M2 with 2 GB.
Here are my stats:
Database Statistics:
db: test
collections: 18
views: 0
objects: 19800
avgObjSize: 339.2477272727273
dataSize: 6717105
storageSize: 162074624
totalFreeStorageSize: 0
numExtents: 0
indexes: 31
indexSize: 324276224
indexFreeStorageSize: 0
fileSize: 0
nsSizeMB: 0
ok: 1
Any idea what’s going on?
So I am learning backend development from The Odin Project, and in their curriculum they teach PostgreSQL, but I was targeting the MERN stack.
So I want to ask: would it be a waste of time to learn PostgreSQL, and should I skip that part and move to MongoDB directly?
Hi,
I’m facing significant challenges while working on a big data pipeline that involves MongoDB and PySpark. Here’s the scenario:
I keep getting Prematurely reached end of stream errors, causing connection failures and slowing down the process. Deletes are currently done one document at a time with delete_one:
def delete_documents(to_delete_df: DataFrame):
    to_delete_df.foreachPartition(delete_one_documents_partition)

def delete_one_documents_partition(iterator: Iterator[Row]):
    dst = config["sources"]["lg_dst"]
    client = MongoClient(secrets_manager.get("mongodb").get("connection.uri"))
    db = client[dst["database"]]
    collection = db[dst["collection"]]
    for row in iterator:
        collection.delete_one({"_id": ObjectId(row["_id"])})
    client.close()
I will soon try changing it to:
def delete_many_documents_partition(iterator: Iterator[Row]):
    dst = config["sources"]["lg_dst"]
    client = MongoClient(secrets_manager.get("mongodb").get("connection.uri"))
    db = client[dst["database"]]
    collection = db[dst["collection"]]
    deleted_ids = [ObjectId(row["_id"]) for row in iterator]
    result = collection.delete_many({"_id": {"$in": deleted_ids}})
    client.close()
I have also tried the MongoPaginateBySizePartitioner with a partitionSizeMB of 64 MB, but it still causes crashes. Thanks for your help! Let me know if you need more details.