/r/sonarr
Sonarr is a PVR for Usenet and BitTorrent users. It can monitor multiple RSS feeds for new episodes of your favorite shows and will interface with clients and indexers to grab, sort, and rename them. It can also be configured to automatically upgrade the quality of files already downloaded when a better quality format becomes available.
You can find us on discord at https://discord.sonarr.tv/
Welcome to the official Sonarr subreddit, please read over the rules before posting.
Having recently gotten into the *arrs, I've encountered a strange problem with the ongoing TV shows I was monitoring. Probably the simplest way of dealing with it would be to disable all public trackers, but I was just a little too frustrated by the lack of a simple solution, so I've created a very small tool to help with it.
What it does:
Right now only a linux/amd64 Docker image is available. It supports multiple instances of Sonarr, and qBittorrent is the only supported download client.
Any feedback is appreciated. Thanks!
https://github.com/flmorg/cleanuperr
Edit: Blacklisting malicious extensions in qBittorrent is an important step for this tool to work as intended.
As the title says: how can I manage to filter for titles with subtitles in both German and English?
My indexer updates TV series as a single torrent file that gets replaced when a new episode is released; with other indexers, as far as I know, each episode comes as a separate torrent. Is there a workaround for this? Right now, when a new episode comes out, I need to take a look at each series and update it manually.
So I'm incredibly new to any form of this. I've been following the Dev guide by Rafael Magalhaes almost perfectly, as far as my miniature mind could comprehend. Now I've reached the 'adding a show' section, and when I add one, it shows up in qBit, but it doesn't download; it's simply stuck on 'downloading'.
To give some context, I only have one indexer. I don't know if that has anything to do with it, as I'm really new and I do not know.
The only thing I can imagine I did wrong is in the qBit section of the guide, where he says to paste some text into qBit (the part having to do with WinRAR); I simply copied the entire text and pasted it with its brackets and everything.
I’m not a very smart person, and these past few days have been very rough on me mentally, I just want to finish this. Thank you for being an awesome person. (if you help)
P.S. I apologise for my poor English, I am not native.
Any idea where I can find more movie/TV releases from t3nzin?
I'm pretty new to torrenting and I want to find some good x264/x265 releases and such!
I inadvertently added a Usenet indexer prior to configuring SABnzbd and now have 3,000 entries in my queue. I've since set up SABnzbd but still have these orphaned entries in the queue, and I cannot delete them using the Remove Selected function. Is there another way of removing them?
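One option, if the UI won't cooperate, is Sonarr's v3 API: the queue is a paged endpoint and individual records can be deleted. The sketch below is a minimal, unofficial example; the host, API key, and `pageSize` value are placeholders, and the endpoint/parameter names are my reading of the v3 API, so verify against your version before running.

```python
import json
import urllib.request

SONARR_URL = "http://localhost:8989"  # adjust to your instance
API_KEY = "your-api-key"              # Settings > General > API Key

def delete_url(item_id: int) -> str:
    # removeFromClient=false: leave the download client alone;
    # blocklist=false: don't blocklist the releases
    return (f"{SONARR_URL}/api/v3/queue/{item_id}"
            "?removeFromClient=false&blocklist=false")

def clear_queue() -> None:
    """Fetch every queue record, then delete them one by one."""
    req = urllib.request.Request(
        f"{SONARR_URL}/api/v3/queue?pageSize=1000",
        headers={"X-Api-Key": API_KEY},
    )
    with urllib.request.urlopen(req) as resp:
        records = json.load(resp)["records"]
    for rec in records:
        d = urllib.request.Request(
            delete_url(rec["id"]),
            headers={"X-Api-Key": API_KEY},
            method="DELETE",
        )
        urllib.request.urlopen(d)
```

Run `clear_queue()` once SABnzbd is configured; with `removeFromClient=false` the orphaned entries are dropped from Sonarr without touching any client.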
I have a Synology NAS from which I download, another Synology that holds my media library, and a third box running Ubuntu with Plex. I want to run the *arr suite (movies, TV shows, subs) on a fourth box (a Raspberry Pi).
As I understand it, the *arr suite can "watch" for new releases of movies or shows and download them, but I can't really figure out how to get all of that working in sync. Where can I read up on how this was designed conceptually?
I'd like my Plex watchlist to trigger the *arr apps, which would then manage my downloader (i.e. start a new torrent download, etc.) and find subs for the media.
Roughly, how would that work, and what would I need to configure on my Raspberry Pi?
Anyone else having issues with Sonarr downloading the same episodes again and again, and sometimes not moving them automatically?
Seems like a new issue for me; I haven't had it before, and it's normally been super reliable.
I noticed recently that 90% of my TV show downloads fail to import. The reason? The main file is a .zipx file. Bogus, virus-infected, or whatever: it's not the video file one expects. I thought simply configuring qBittorrent not to download .zipx files would solve it: the download would fail, Sonarr would become aware and search for another release. However, qBittorrent is quite dumb about this. The status of magnets/torrents that contain (only) .zipx files just changes to "seeding" while 0% of the file has been downloaded. This also means the episode simply stays "missing" in Sonarr.
By now, it seems 100% of such downloads have "successfulCrab" in the release name. This is good news, because now I can tell Sonarr to never select a release with that name. Issue solved!
Only problem: I have no clue how to do that. I understand I should add a Custom Format, but then I also need to put some regex in there, and I don't know how to write it. Any help would be much appreciated!
EDIT: solved. Here's how: Sonarr > Settings > Profiles > scroll to "Release Profiles" > add a release profile and put "SuccessfulCrab" in the "Must Not Contain" field. Set Indexer to Any.
Then in Sonarr > Activity, select all and hit delete, choosing the option to blacklist and search for alternatives.
Check back an hour or so later and notice that everything that was missing is now properly downloaded.
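For anyone wanting to script the same fix across several instances, a release profile can also be created through Sonarr's v3 API. This is a hedged sketch, not an official recipe: the host and API key are placeholders, and the payload field names (`ignored`, `required`, `indexerId`) reflect my reading of the v3 `releaseprofile` endpoint, so check them against your Sonarr version.

```python
import json
import urllib.request

SONARR_URL = "http://localhost:8989"   # adjust to your instance
API_KEY = "your-api-key"               # Settings > General > API Key

# Same profile as built in the UI: reject any release whose name
# contains "SuccessfulCrab", on any indexer (term matching is
# case-insensitive in Sonarr).
profile = {
    "name": "Block SuccessfulCrab",
    "enabled": True,
    "required": [],                    # no "must contain" terms
    "ignored": ["SuccessfulCrab"],     # "must not contain" terms
    "indexerId": 0,                    # 0 = any indexer
    "tags": [],
}

req = urllib.request.Request(
    f"{SONARR_URL}/api/v3/releaseprofile",
    data=json.dumps(profile).encode(),
    headers={"X-Api-Key": API_KEY, "Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment to actually create the profile
```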
I thought I remembered a view mode where you could see every series and every season, and quickly monitor/unmonitor individual seasons. Am I making this up? Or does it still exist and I just can't find it?
Thanks for your help!
Hi all,
After a little bit of help.
Noticed today that Sonarr has stopped importing completed downloads. I can manually import them by going to the completed folder and using the interactive import feature.
Nothing has changed from what I can see.
I run SABnzbd and Sonarr as a service, with an account, so as to be able to use network paths.
I did note SABnzbd was not complaining that my completed and incomplete folders were on a network drive, but as a test I moved them to local drives anyway.
Sonarr still didn't import any new downloads.
Handle Completed Downloads is enabled (I had to figure out where that was, as it had been a while).
Unsure what to check, as this config has worked for ages until today.
Will provide version when I get back from the chippy😂
Been using it for years without issue, so I'm guessing something has changed and I've not paid attention.
Edit: Realized I said “Manual Search” in the title but I meant the “Automatic Search” next to each episode.
I noticed a show (The Penguin) was missing all episodes, so I clicked "Search Monitored" and "Search Monitored Episodes in this Season", but both found no results.
However, if I click "Automatic Search" for each separate episode, it will successfully find the episode and start downloading.
Is this a bug or something I'm not understanding?
Version 4.0.10.2544
So I have a system set up with Unraid. I have DelugeVPN running and Sonarr set up. I am using NZBgeek as an indexer, and both appear to be in working order.
As I add items to my queue, they all get stuck saying "pending - download client is unavailable". I've tried reinstalling/reconfiguring both Dockers, to no avail.
Any insight would be great!
Hi, I was downloading Clone Wars and Sonarr found every season, but then only grabbed the first one; the same also happened with Hunter x Hunter.
I had to import them manually.
Any fix?
Been using Sonarr for a year without difficulty. In the last two days, something changed that I'm still trying to figure out. Honestly, I came here initially to ask whether I can reinstall Sonarr but have it remember the series it contained before the problem occurred; maybe via an XML config file or something like that.
But first I should ask if anyone can suggest how to go about debugging my problem. Here is how it presents:
**** Note: Need to understand the process for sharing logfiles....will edit post once I figure this out *****
The log file makes it seem like Sonarr believes it already has the series downloaded, maybe? But the files aren't there, and so it skips everything. I may not be interpreting the log correctly, of course... hence the request here. :-)
Appreciate any suggestions
Kevin
I am running Plex on a Synology NAS. When I add items to my watchlist in Plex, I would like Sonarr to grab the available episodes ONLY IF I have a subscription to the network that released the show. For example, I have Hulu, Netflix, Prime, and all the over-the-air networks (ABC, CBS, NBC, Fox, and PBS), and I would like those shows. But I do NOT have Apple TV, HBO, STARZ, or MAX, and do not want those shows to download.
I anticipate subscribing to those services periodically to watch certain shows, and would like Sonarr to download them while I have a subscription. I would obviously need to change a setting in Sonarr; I just don't know how.
Is this possible?
Thanks in advance for any guidance.
Sonarr on my NAS mostly fetches NZBs. I added some BitTorrent trackers to fall back on. When I select a torrent to download in Sonarr, it automatically loads on my remote seedbox (rTorrent). Works great, but...
The episodes download into `../downloads/tv/` on the seedbox, and an rsync script running on my NAS fetches this directory from the seedbox. The issue is that my rsync script dumps the episodes straight into the `../Series/` directory on my NAS instead of, say, `../Series/The Penguin (2024)/`.
Typically when I grab torrents myself, I have an RSS filter that detects the series and saves episodes to `../downloads/tv/The Penguin (2024)/`, so when the rsync script runs, the episodes land in the corresponding series directory on the NAS.
The perfect solution would be if, in the advanced settings for my BitTorrent download client in Sonarr, I could use Media Management tokens in the Directory field, like `../downloads/tv/{Series TitleYear}/`, but I don't think that's possible...
I'm pretty new to *arr and it's been an absolute game changer so far, but I'm hoping someone with more experience can tell me how they would handle this. My next thought is to run a similarity match on incoming filenames against existing series directories, but that seems like overkill and probably not very reliable. FWIW, this has also been an issue with Radarr, albeit to a lesser extent.
I have Sonarr (and Radarr, for what it's worth) running in Docker on a Synology NAS. I had to replace my HDD and re-create the downloads and video shared folders.
Downloads are now working properly again (I had to set up new bindings, permissions, etc.); however, they are not being imported. They just stay in Transmission in idle status ("seeding complete").
Looking at the logs, Sonarr clearly has no write permissions to the /movies folder (the new shared folder).
My user's PUID is still the same (1029) at the Docker level, and that user does have write permissions to all those folders. So, what should I do?
Hi everyone.
I recently started upgrading all my 720p files to 1080p. I allowed upgrades in my 1080p quality profile.
I have some shows in Bluray 1080p, and for some reason Sonarr keeps downgrading them to WEB 1080p. I have the cutoff quality set at WEB 1080p, but I don't want Sonarr to downgrade the Bluray files to WEB.
Thanks in advance
Just wondering if there is a way to set things up in Sonarr and Radarr so that a missing file is prioritised as high, while a better-quality upgrade is prioritised as low, when sent to SAB. Anyone got something like this working?
Hello,
I apologize, but I asked this before and can't remember the solution someone gave; I thought it was so easy I couldn't forget :(
I have two folders: TVnew, for new TV shows, and a separate one at the root of the same drive, TVold. TVnew is just shows that still have new episodes being released, and the old one is shows that are off the air or in rerun only, e.g. MASH. I usually only use Sonarr for new shows, but every now and then I like to review my old shows, and there are too many to browse if they are combined (for my tastes).
Does anyone know how to set up a second profile on the same computer? It seems like all I had to do was flip something in the top right and I could see the second one. (My computer needed a major overhaul and I moved, so I've been away for a while.)
I know Sonarr doesn't handle multi-season torrents. That's fine; handling them nicely is a difficult problem. But something has changed recently in what it does with those torrents.
Sonarr wouldn't import them; it just wouldn't touch them at all. Which was fine: I could just move the folders around so each season had its own folder, or even dump all the files right into the root download directory.
But recently, Sonarr started completely deleting everything except the first season, which is not fun when it's a massive show that took a long time to download.
Any idea what has changed?
How accurate is the calendar in Sonarr? I noticed today that my Sonarr says the new season of Wednesday starts November 23rd, but I can't find anything online about that, and many websites say it's releasing sometime in 2025.
I've seen some posts about episode release times being off due to timezones, but I couldn't find anything about the calendar just straight up making release dates up.
I was sure this question had already been asked (and answered) somewhere, but I honestly can't find anything on it! I have a vague memory of setting this up the last time I installed Sonarr, but now I can't find any settings for it.
Guys help me please
I had about 6 TB of not-perfect but watchable-quality movies that I collected over the last 15 years.
I recently plugged them into Sonarr to patch up incomplete shows and upgrade the quality.
I’m only looking for watchable 1080p as good as what you would get on Netflix with a decent connection
I let Sonarr run loose on about half of the collection, and it has now ballooned to 20 TB.
That's WAY too big. It looks great, but I'm noticing the sizes can be random, and I don't necessarily see a big visual difference between a 2 GB 1080p file and a 10 GB 1080p file.
For example, The Smurfs started at probably 5-10 GB and now it's 212 GB!
That's WAY too much space for a decades-old SD cartoon.
What can I do? I am using the HD 720p/1080p profile, and I moved the quality-size sliders WAY down before starting.
What's the slider sweet spot for quality/size? Is there something else I can do?
Thank you
UPDATE: Solved using qbitmanage with the noHardlink check and share-limit cleanup. If anyone is searching for the same answer, this is the GitHub.
So when Sonarr grabs an upgrade torrent, is there a way to make Sonarr pause the old torrent so that it gets removed by Completed Download Handling to free disk space?
Hey everyone, not sure when this started happening, as I've flushed the logs, but all of a sudden within the last week I can no longer reach my indexers. I've logged into my server and verified that I can reach the API for each of them outside of the Sonarr application, but I'm not sure why it's not succeeding. I even tried rotating API keys. Console output of `nc` and the logs from Sonarr shown below. I can't seem to make out what's going on.
deezy in 🌐 mediatemple in /home/deezy
❯ nc -v -w 3 api.nzbgeek.info 443
Connection to api.nzbgeek.info 443 port [tcp/https] succeeded!
deezy in 🌐 mediatemple in /home/deezy took 3s
❯ nc -v -w 3 api.nzbplanet.net 443
Connection to api.nzbplanet.net 443 port [tcp/https] succeeded!
Sonarr Logs:
[v3.0.10.1567] System.Net.WebException: Error: SecureChannelFailure (Authentication failed because the remote party has closed the transport stream.): 'https://api.nzbplanet.net/api?t=caps&apikey=(removed) ---> System.Net.WebException: Error: SecureChannelFailure (Authentication failed because the remote party has closed the transport stream.) ---> System.IO.IOException: Authentication failed because the remote party has closed the transport stream.
at Mono.Net.Security.AsyncProtocolRequest.ProcessOperation (System.Threading.CancellationToken cancellationToken) [0x0014d] in <a85c1a570f9a4f9f9c3d2cfa5504e34f>:0
at Mono.Net.Security.AsyncProtocolRequest.StartOperation (System.Threading.CancellationToken cancellationToken) [0x000a4] in <a85c1a570f9a4f9f9c3d2cfa5504e34f>:0
at Mono.Net.Security.MobileAuthenticatedStream.ProcessAuthentication (System.Boolean runSynchronously, Mono.Net.Security.MonoSslAuthenticationOptions options, System.Threading.CancellationToken cancellationToken) [0x00346] in <a85c1a570f9a4f9f9c3d2cfa5504e34f>:0
at Mono.Net.Security.MonoTlsStream.CreateStream (System.Net.WebConnectionTunnel tunnel, System.Threading.CancellationToken cancellationToken) [0x001f4] in <a85c1a570f9a4f9f9c3d2cfa5504e34f>:0
at System.Net.WebConnection.CreateStream (System.Net.WebOperation operation, System.Boolean reused, System.Threading.CancellationToken cancellationToken) [0x001f5] in <a85c1a570f9a4f9f9c3d2cfa5504e34f>:0
--- End of inner exception stack trace ---
at System.Net.WebConnection.CreateStream (System.Net.WebOperation operation, System.Boolean reused, System.Threading.CancellationToken cancellationToken) [0x00275] in <a85c1a570f9a4f9f9c3d2cfa5504e34f>:0
at System.Net.WebConnection.InitConnection (System.Net.WebOperation operation, System.Threading.CancellationToken cancellationToken) [0x0015b] in <a85c1a570f9a4f9f9c3d2cfa5504e34f>:0
at System.Net.WebOperation.Run () [0x000b7] in <a85c1a570f9a4f9f9c3d2cfa5504e34f>:0
at System.Net.WebCompletionSource`1[T].WaitForCompletion () [0x000b1] in <a85c1a570f9a4f9f9c3d2cfa5504e34f>:0
at System.Net.HttpWebRequest.RunWithTimeoutWorker[T] (System.Threading.Tasks.Task`1[TResult] workerTask, System.Int32 timeout, System.Action abort, System.Func`1[TResult] aborted, System.Threading.CancellationTokenSource cts) [0x00118] in <a85c1a570f9a4f9f9c3d2cfa5504e34f>:0
at System.Net.HttpWebRequest.GetResponse () [0x00019] in <a85c1a570f9a4f9f9c3d2cfa5504e34f>:0
at NzbDrone.Common.Http.Dispatchers.ManagedHttpDispatcher.GetResponse (NzbDrone.Common.Http.HttpRequest request, System.Net.CookieContainer cookies) [0x00123] in C:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Common\Http\Dispatchers\ManagedHttpDispatcher.cs:81
--- End of inner exception stack trace ---
at NzbDrone.Common.Http.Dispatchers.ManagedHttpDispatcher.GetResponse (NzbDrone.Common.Http.HttpRequest request, System.Net.CookieContainer cookies) [0x001c0] in C:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Common\Http\Dispatchers\ManagedHttpDispatcher.cs:107
at NzbDrone.Common.Http.HttpClient.ExecuteRequest (NzbDrone.Common.Http.HttpRequest request, System.Net.CookieContainer cookieContainer) [0x00086] in C:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Common\Http\HttpClient.cs:126
at NzbDrone.Common.Http.HttpClient.Execute (NzbDrone.Common.Http.HttpRequest request) [0x00008] in C:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Common\Http\HttpClient.cs:59
at NzbDrone.Common.Http.HttpClient.Get (NzbDrone.Common.Http.HttpRequest request) [0x00007] in C:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Common\Http\HttpClient.cs:281
at NzbDrone.Core.Indexers.Newznab.NewznabCapabilitiesProvider.FetchCapabilities (NzbDrone.Core.Indexers.Newznab.NewznabSettings indexerSettings) [0x000a1] in C:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Core\Indexers\Newznab\NewznabCapabilitiesProvider.cs:64
at NzbDrone.Core.Indexers.Newznab.NewznabCapabilitiesProvider+<>c__DisplayClass4_0.<GetCapabilities>b__0 () [0x00000] in C:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Core\Indexers\Newznab\NewznabCapabilitiesProvider.cs:36
at NzbDrone.Common.Cache.Cached`1[T].Get (System.String key, System.Func`1[TResult] function, System.Nullable`1[T] lifeTime) [0x000b1] in C:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Common\Cache\Cached.cs:104
at NzbDrone.Core.Indexers.Newznab.NewznabCapabilitiesProvider.GetCapabilities (NzbDrone.Core.Indexers.Newznab.NewznabSettings indexerSettings) [0x00020] in C:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Core\Indexers\Newznab\NewznabCapabilitiesProvider.cs:36
at NzbDrone.Core.Indexers.Newznab.Newznab.get_PageSize () [0x00000] in C:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Core\Indexers\Newznab\Newznab.cs:24
at NzbDrone.Core.Indexers.Newznab.Newznab.GetRequestGenerator () [0x00000] in C:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Core\Indexers\Newznab\Newznab.cs:28
at NzbDrone.Core.Indexers.HttpIndexerBase`1[TSettings].TestConnection () [0x00007] in C:\BuildAgent\work\63739567f01dbcc2\src\NzbDrone.Core\Indexers\HttpIndexerBase.cs:336
2024-11-05 12:24:35.0|Warn|SonarrErrorPipeline|Invalid request Validation failed:
-- : Unable to connect to indexer, check the log for more details
If anyone has any insight it would be appreciated!
I've noticed quite a number of paid indexers (geek, ninja, drunkenslug) still have a number of failures with automated downloaders. They eventually find a legit one but it would seem sensible to allow for sonarr to report back that it was dead or missing too many articles.
Does this feature exist?
Granted it would require an API on the indexer to receive these reports so it's not just a sonarr problem.
Seems it would also help indexers to be clear of the dud ones.
Not sure whether this is a Sonarr issue or an NZBGet issue... Multiple episodes are downloading, but they don't go into the series folder (all the episode files just dump into my 'TV Shows' folder), and Sonarr doesn't show them as completed downloads. Any ideas?
Hello,
I'm about to reformat the hard drive that has the Sonarr program installed (not the one where the TV shows are stored). Is there a way to back up or export my database and import it once the new image is set up? I know there were a lot of shows I had to tweak in Sonarr, and I'd prefer to just copy the database, since the location of the TV shows won't change. I'd rather not rescan the hard drive and set up the DB from scratch.
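Sonarr keeps its state in its AppData directory (the database, `sonarr.db` in recent versions, plus `config.xml`), and the UI also offers System > Backups for exactly this. For a manual copy before reformatting, a minimal sketch follows; the AppData path below is a typical Windows location and is only a guess for your setup, so adjust it to wherever your install actually keeps those files.

```python
import zipfile
from pathlib import Path

# Typical Windows location; adjust for your install (hypothetical path)
APPDATA = Path(r"C:\ProgramData\Sonarr")
DEFAULT_OUT = Path("sonarr-backup.zip")

def backup_sonarr(appdata: Path = APPDATA, out: Path = DEFAULT_OUT) -> list[str]:
    """Zip the database and config file; returns the names archived."""
    archived = []
    with zipfile.ZipFile(out, "w") as z:
        for name in ("sonarr.db", "config.xml"):
            src = appdata / name
            if src.exists():
                z.write(src, arcname=name)  # store flat, without the full path
                archived.append(name)
    return archived
```

Stop Sonarr before copying so the database isn't mid-write; after reimaging, restore both files into the new install's AppData directory before first launch.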