/r/SQLServer
Microsoft SQL Server Administration and T-SQL Programming including sql tutorials, training, MS SQL Server Certification, SQL Server Database Resources.
I lost all the rows from two of the tables in my database. No foreign keys are set, and the tables are only linked by a view, not even linked in the application.
Is there any way to find out why this happened? I can't find anything in the SQL Server logs.
I checked the application: the table is used in only two places, and neither has a query that deletes all the records in the whole table.
One of my teams has been using SQL ML with Open R for years. A recent change in the solution dataset has resulted in a growth problem. The dataset is what it is; the factor that affects this equation the most is external.
This instance runs on Standard Edition. It's a basic app, only a few folks use it, and its outputs are primarily advisory (it's analytics, after all; the app provides guidance on modest input datasets against models). It uses R to perform the analytic work. The instance is SQL Server 2016 with the latest SP/CU.
Now that our dataset has changed, we're experiencing out of memory errors from R, specifically presented as
"Message":"An external script error occurred: \nError: cannot allocate vector of size 7.8 Gb\n\nError in ScaleR. "
I've opened a case, and I've read the documentation. The ultimate problem is the half-assed way SQL ML is implemented. R scripts are self-contained configurations; they can include their own operating variables. In the case of SQL Server, memory and CPU constraints are injected by SQL Server at runtime. To make this incredibly confusing, Microsoft implemented these controls with Resource Governor and also set a default CPU% of 100 and a default RAM% of 20. If you have a SQL Standard instance with 16 sockets at one core per socket, SQL Server will only use 4 sockets, but R has no awareness of that and will get all 16, because the Resource Governor configuration only applies the CPU%.
We can no longer operate under 20%; we grew from 32 GB to 64 GB of RAM and still find ourselves failing. The SQL Server instance only needs about 16 GB to satisfy the buffer pool, so we're already deep in waste chasing this.
Has anyone else tried any workarounds other than buying Enterprise Edition to leverage Open Source software? lol
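If your edition and configuration allow Resource Governor changes for external pools, the 20% default the post describes is adjustable; a minimal sketch, assuming the default external pool is the one serving the R jobs:

```sql
-- Raise the memory cap on the external resource pool used by ML Services.
-- Whether this is permitted on Standard Edition is the open question here.
ALTER EXTERNAL RESOURCE POOL [default]
WITH (MAX_MEMORY_PERCENT = 50);

ALTER RESOURCE GOVERNOR RECONFIGURE;
```

Since the buffer pool only needs ~16 GB, capping SQL Server's own max server memory and handing the remainder to the external pool is the usual balancing act.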
If I provision an SQL database from services like DigitalOcean, Linode, Vultr, or AWS, and obtain the connection string, would that database be considered publicly unsafe, even though it requires a username and password for access? Additionally, if I use the connection string in my desktop app, is that okay or not? Do I need to secure it somehow from unauthorized access? What security measures should I take to ensure it's safe?
Thanks for your insights!
My previous post about historic license increases made me think of something...
Since 2022 requires SA for virtual installations, what are the rules on downgrade rights? If I buy any license now it will be for the 2022 version, but can I run 2016-2019 virtually without needing SA?
Or does the SA requirement also apply to downgrade rights?
On our Azure SQL database we grant permissions to users through an Entra group. This is an Azure-only environment. This setup had been working well until today, when we added a new service principal to the group that grants DB permissions. That new Entra service principal cannot access the database. I have the DB set up to send logs to Azure blob storage; I reviewed the logs, and they very clearly show a login failed message for this new SP. This error is documented here. State 5 indicates that the user ID isn't valid. Other users in the same group are able to access the DB. When I grant the SP direct access (no Entra group), it is able to access the DB.
additional_information <login_information><error_code>18456</error_code><error_state>5</error_state></login_information>
Has anyone seen this before where Entra group membership does not work to allow access to the database? Again it's an Azure SQL DB. It's a copy that we recently made from our DEV DB for load testing purposes. I have dropped and recreated the database user entirely. This is set up as a contained database user and the app is specifying the correct DB name in the connection string. Any help is appreciated.
Edit: I was able to fix this by turning off the system assigned managed identity for the app and then turning it back on. I then had to add that identity back to the Entra group in order to give it DB permissions.
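For anyone hitting the same error-state-5 symptom with group-based access, another thing worth trying is recreating the contained user for the group, in case it holds a stale object ID; a sketch, with a hypothetical group name, run in the target database (not master):

```sql
-- Hypothetical Entra group name; contained database user mapped to the group.
DROP USER IF EXISTS [entra-db-access-group];
CREATE USER [entra-db-access-group] FROM EXTERNAL PROVIDER;

ALTER ROLE db_datareader ADD MEMBER [entra-db-access-group];
ALTER ROLE db_datawriter ADD MEMBER [entra-db-access-group];
```

This matters particularly for copied databases like the one described, since the copy carries the source DB's user objects with it.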
Hi all, I am working on a query where I have items ranked by date, and I want to return the name of the next item in the rank. Is there a simple way to do this?
Ex: {Item}, {Rank}
Item A, 1
Item B, 2
Item C, 3
Item D, 4
Desired output:
{Current_Item}, {Next_Item}
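This maps directly onto the LEAD window function; a sketch assuming a table named as below with the columns from the example:

```sql
-- RankedItems is a hypothetical table holding the Item/Rank pairs above.
SELECT [Item] AS Current_Item,
       LEAD([Item]) OVER (ORDER BY [Rank]) AS Next_Item
FROM RankedItems;
```

The last row's Next_Item comes back NULL; LEAD takes optional offset and default arguments if you need something else there.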
Hey all, I'm new to SQL and trying to learn some things. At work we have outbound data for every month of the year. What I've done is load all the monthly outbound Excel files into SQL (well over a million rows). How would I go about adding up each month's totals? For example, taking January through May's data and summing the state column. When I try to do it in a query I don't get any results. Thanks.
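A grouped sum is the usual shape for a question like this; a sketch with hypothetical table and column names (Outbound, ShipDate, State):

```sql
-- Sum the state column per month for January through May of one year.
SELECT YEAR(ShipDate)  AS Yr,
       MONTH(ShipDate) AS Mo,
       SUM(State)      AS StateTotal
FROM Outbound
WHERE ShipDate >= '2024-01-01'
  AND ShipDate <  '2024-06-01'
GROUP BY YEAR(ShipDate), MONTH(ShipDate)
ORDER BY Yr, Mo;
```

One common reason for "no results" after an Excel import is that the numeric column came in as text, in which case SUM needs a TRY_CAST(State AS decimal(18,2)) first.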
So this is new for me; I've never used such reports before. In the company I work for we use SSRS mainly on desktop and laptop devices. The funny thing is that I can't load any report on an Android phone or tablet, but I easily can on a 55" smart TV with the exact same, latest version of the Chrome app.
What's going on here? Can SSRS not handle rendering reports on smaller screens? It refreshes over and over, every 3 seconds or so. I tried requesting the desktop site on a couple of devices, and I tried other browsers like Opera and Edge for Android; no way to make reports load.
NOTE: I CANNOT paste the plan due to security restrictions (I work in a pseudo air gapped network)
Hi, I have a query with optional parameters and depending on whether you select 'ALL' or a specific item the execution plan will change. The reason for the wild difference is due to the use of Temp tables (a necessity for the 'ALL' scenario). The 'ALL' scenario returns like 250,000+ records whereas the specific item scenario returns <1000.
ALL Scenario
When I optimize the query (indexes specifically) for the ALL scenario, my execution plan will use unwanted parallelism and full index scans when the optional parameters (specific item) are used, BUT will use key lookups and non-clustered index scans when querying based on the 'ALL' parameter. In this scenario the 'ALL' runs quickly, and the specific item will be faster than 'ALL' but much slower than if I optimize for the specific item.
Specific Item Scenario
When I optimize for the parameters, the 'ALL' scenario will use full index scans everywhere, but the parameters will use key lookups. In this scenario the 'ALL' takes anywhere from 11-16 seconds to run, whereas the specific items will be around 600ms.
I have identified the following two solutions:
Find a way to professionally tell the customer we should have two stored procedures, and have the application choose which one to call based on the parameters.
Create a neatly commented and formatted IF..ELSE to handle both scenarios individually.
My question is this, are these the only two ways to handle this or is there a possible third solution I can explore? What is the best way to handle my dilemma? Both scenarios are used at roughly the same rate.
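A third option sometimes worth testing for this exact "one plan, two very different row counts" situation is letting the optimizer build a fresh plan per execution; a sketch of the catch-all predicate pattern, with hypothetical names:

```sql
-- Hypothetical query body; @ItemId IS NULL represents the 'ALL' case.
SELECT t.ItemId, t.ItemName, t.SomeValue
FROM dbo.SomeTable AS t
WHERE (@ItemId IS NULL OR t.ItemId = @ItemId)
OPTION (RECOMPILE);  -- compile a plan for the actual parameter values each run
```

The trade-off is compile cost on every call, so for a very hot procedure the IF..ELSE split (or two procedures) is still the safer choice; for moderate call rates it keeps one readable statement.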
Hi.
When doing a SQL Server upgrade from 2016 SP3 to 2022, I'm running into an issue when it tries to finalize the full-text index part of the upgrade.
It seems like the upgrade locks itself out by putting permissions on the FilterData folder, so it can no longer be accessed by the install process. When I try to check permissions on the folder as a local admin, I get access denied; I've even tried via psexec as SYSTEM, but no luck.
The error I'm getting in the SQL install is the following:
I can see it's associated with error code 30064.
I've tried different settings like rebuild and reset, but it makes no difference; it fails on the same step.
Has anyone experienced this, and how do I fix it?
I don't suppose anyone has a record of, or can get hold of, the historic license cost increases? Specifically for Software Assurance-only renewals. An average per-year % increase would be perfect.
I'm cost analysing MPSA vs. ESA vs. SCE for the next 10 years (simply to align to SQL support lifecycle). Typically we've done MPSA without SA because of the basically zero need to keep to the latest version unless a specific app requires it, but we're currently maxed at 2019 (different cores licensed to different versions).
Since 2022 in virtual deployments now requires SA, I need to start factoring that in.
Thanks
Classic story
Giant stored procedure that's causing 30% of the CPU utilization
Ran it through the advisor, it recommended 5 indexes, I applied them, utilization dropped down to 2%
I know not to blindly apply indexes, so I would like to learn what's going on
So what do I do next? How do I learn to read/understand query execution plans?
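A practical starting point for learning to read plans is capturing the actual plan together with the I/O numbers for the same run; a minimal sketch, with a hypothetical procedure name:

```sql
-- Per-table logical reads and CPU/elapsed time for the run.
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

-- In SSMS, also enable 'Include Actual Execution Plan' (Ctrl+M) before running.
EXEC dbo.YourBigProc;  -- hypothetical name

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;
```

Reading the plan, the usual habit is to compare estimated vs. actual row counts on each operator; the places where they diverge by orders of magnitude are generally where those five indexes earned their keep.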
I have a couple of "deployment techs" who set up new databases on our production SQL Server. I know, but I have no say over who does what.
They aren't super knowledgeable about SQL Server in general, but they know enough to run a SQL script given to them by a developer, and how to restore a database backup.
What I need to do is force an Active Directory group to have read/write permissions to every database. The users in that AD group are all service accounts that run various processes against the databases like ETL loads or address correction, name cleaning and so on. But, I don't trust the deployment users to always set permissions correctly which can cause lots of common tasks to fail.
Adding the AD group to the Model db would, I think, cover databases created via sql script (CREATE DATABASE...). But how can I set it up such that any database that is restored to the server from a backup uses the default permissions setup in Model, or elsewhere?
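Since model only covers freshly created databases and restored ones carry their own users, one pattern is a small loop (run after each restore, or on a schedule via an Agent job) that stamps the group into every user database; a sketch, with a placeholder AD group name, assuming the group already exists as a server login:

```sql
-- Grant a hypothetical AD group read/write in every online user database.
DECLARE @sql nvarchar(max) = N'';

SELECT @sql += N'
USE ' + QUOTENAME(name) + N';
IF NOT EXISTS (SELECT 1 FROM sys.database_principals
               WHERE name = N''DOMAIN\SvcAccounts'')
    CREATE USER [DOMAIN\SvcAccounts] FOR LOGIN [DOMAIN\SvcAccounts];
ALTER ROLE db_datareader ADD MEMBER [DOMAIN\SvcAccounts];
ALTER ROLE db_datawriter ADD MEMBER [DOMAIN\SvcAccounts];'
FROM sys.databases
WHERE database_id > 4          -- skip system databases
  AND state_desc = 'ONLINE';

EXEC sp_executesql @sql;
```

Scheduling this hourly is blunt but closes the window in which a freshly restored database lacks the grants, without relying on the deployment techs at all.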
Hi! I need your help. I have two entries for the Pokémon Venusaur (Venusaur, Male, Shiny and Venusaur, Female, NotShiny). I want to retrieve zero rows for shiny = false regardless of gender. How can I achieve this?
I mean, I want to know whether I have a shiny Venusaur regardless of gender, but this Pokémon is returned in both the non-shiny list and the shiny list. I want it returned ONLY in the shiny list, and not in the non-shiny list.
Edit: Issue has been fixed with the following logic.
SELECT sc.[Dex#], sc.[Pokémon],
MAX(CASE WHEN sc.[Shiny?] = 'TRUE' THEN 1 ELSE 0 END) as shinyFlag
FROM ShiniesCaught as sc
GROUP BY sc.[Dex#], sc.[Pokémon]
ORDER BY sc.[Dex#]
THANK YOU to everyone who replied. You guys are awesome. Thank you. Much love
I've inherited this platform and am trying to understand how to suppress these 400,000 emails that build up over the course of a month. Just to preface: there is a sort of invisible disconnect here between how we respond to alerts and the alerts raised.
For example (using arbitrary values from here on):
In Idera, if a host raises a critical alert because disk X is 90% full and it's never addressed, the alert-raised threshold will inevitably be met (say it's something conservative, like the alert having to be raised for 15 minutes before a response is triggered), and the host will then carry a critical alert on every refresh, never changing severity.
So, I understand that there is the Alert Suppression page for many metrics that allow you to set a threshold wherein the alert needs to have met a certain threshold for X amount of minutes before being raised. In other words, if a metric exceeds a threshold, Idera reports an alert (informational, warning, or critical). That makes sense to me.
One of our alert responses for critical alerts is configured in such a way that it's set so that "Where metric severity has changed" is enabled, in addition to, "Where metric severity is unchanged for a specific time period".
In the rule description it reads as "severity is Critical and metric severity has unchanged specific time frame 4 minutes", followed by email actions.
If an alert is "still" raised every refresh in the same state and was not snoozed or addressed, it would stand to reason that the severity is not changing. Do I need to uncheck the "Where metric severity is unchanged for a specific time period"?
The goal of this would be so that we only get one email ever for any given alert that we can then act on, instead of having to dig through 400k emails.
I've been using SSRS for quite a few years now and didn't think this would be as confounding as it is. I'm really hoping my brain is just melting today but I cannot figure out how to get this to work.
I'm attempting to build a report of customer accounts whose last two meter-read attempts were bad. For simplicity's sake, my data has three columns that matter:
Account | Date | Type |
---|---|---|
100 | 01/01/2024 | A |
100 | 04/01/2024 | E |
100 | 07/01/2024 | A |
100 | 10/01/2024 | E |
200 | 01/01/2024 | E |
200 | 04/01/2024 | E |
200 | 07/01/2024 | A |
200 | 10/01/2024 | A |
300 | 01/01/2024 | A |
300 | 04/01/2024 | E |
300 | 07/01/2024 | E |
300 | 10/01/2024 | E |
The report is grouped by Account. I want to set Visibility to True ONLY if the last two consecutive reads have TYPE = "E".
So in this data example I only want it to show Account 300.
I've been able to get the count just fine a few different ways, but for the life of me I can't get SSRS to accept the value for use in hiding the group. It's either "Unable to use ReportItems except in page header or footer", or "unable to do aggregates on ReportItems not in scope".
Does anyone have a suggestion on how to approach this? I feel like I've conquered this before, but I'm pulling my hair out.
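One way to sidestep the ReportItems scope fight entirely is to compute the condition in the dataset query instead; a sketch assuming the three columns shown and a hypothetical table name:

```sql
-- Number reads newest-first per account, then keep accounts whose
-- two most recent reads are both type 'E'.
WITH Ranked AS (
    SELECT Account, [Date], [Type],
           ROW_NUMBER() OVER (PARTITION BY Account
                              ORDER BY [Date] DESC) AS rn
    FROM MeterReads  -- hypothetical table name
)
SELECT Account
FROM Ranked
WHERE rn <= 2
GROUP BY Account
HAVING MIN([Type]) = 'E' AND MAX([Type]) = 'E';
```

Against the sample data this keeps only account 300, and the report then needs no group visibility expression at all.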
Is there a difference between
CREATE OR ALTER FUNCTION [FTest](...)
RETURNS TABLE
AS
RETURN
(
SELECT [SomeCol]
FROM [SomeTable]
)
and
CREATE OR ALTER FUNCTION [FTest](...)
RETURNS @TempTable TABLE
(
[SomeCol] [INT] PRIMARY KEY
)
AS
BEGIN
INSERT INTO @TempTable
SELECT [SomeCol]
FROM [SomeTable]
RETURN
END
E.g. do they both copy data into a temp table or is the first one more like a view? If it makes a temp table, what about indexing?
I've used SQL Sentry for years with SQL Sentry itself and all monitored databases located on-prem. We are going to migrate one of our SQL Server instances to AWS on EC2. Should I expect SQL Sentry to still work okay assuming it can still hit the instance? Any special considerations or settings I need to change in SQL Sentry for this soon-to-be remote instance?
Our web application, developed in .NET 4.8 against SQL Server 2019, is not able to handle a large number of concurrent users; we're getting deadlocks and a lot of escalations from the client. Please help me make this application smooth and fast.
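A first diagnostic step for deadlocks is pulling the graphs SQL Server already records in the built-in system_health Extended Events session; this query shape is the standard way to read them from the ring buffer:

```sql
-- Extract deadlock reports captured by the system_health session.
SELECT
    xed.value('@timestamp', 'datetime2') AS DeadlockTime,
    xed.query('.')                       AS DeadlockGraph
FROM (
    SELECT CAST(st.target_data AS xml) AS target_data
    FROM sys.dm_xe_session_targets AS st
    JOIN sys.dm_xe_sessions        AS s
      ON s.address = st.event_session_address
    WHERE s.name = 'system_health'
      AND st.target_name = 'ring_buffer'
) AS src
CROSS APPLY target_data.nodes(
    '//RingBufferTarget/event[@name="xml_deadlock_report"]') AS x(xed);
```

Each graph names the competing statements and the locked resources; fixing the indexing and access order for those specific statements usually does far more than any connection-level tuning.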
I want to create an .exe file that installs SQL Server Express along with the database and configurations automatically on my clients' computers (db local without server). I read the license, but I still have some doubts. 'Add significant primary functionality to it in your programs' — does this mean adding some functionality to SQL Server Express? Or does it mean that SQL Server Express should be a primary component of my program? Maybe I don’t understand it correctly due to my level of English. Source: https://www.microsoft.com/en-us/useterms/#areaheading-uid6738233 Thanks for your time
Is there a way to run Backup Cert in SQL and have the backup file owner as something other than the SQL service account?
The reasoning behind this: I have a requirement to take a monthly backup of TDE certs across a few hundred servers, bring them back to a central server, then zip them up into a password-protected 7-Zip archive. I have all of this working, until I realized that the backup of the certificate is owned by the SQL service account, and I can't programmatically add it to the archive because the account running the PowerShell script is a different account.
I know I can change the ACL in the script, but that would require elevated privileges (which the account the script runs under has); I want to throw this in a monthly SQL job.
Thanks!
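For reference, the backup itself is plain T-SQL, and the files land wherever the engine's service account can write; the certificate name and paths below are placeholders:

```sql
-- TDE certificate backup; the SQL service account writes these files,
-- which is why they end up owned by it.
BACKUP CERTIFICATE TDECert  -- hypothetical certificate name
TO FILE = N'\\centralshare\certs\TDECert.cer'
WITH PRIVATE KEY (
    FILE = N'\\centralshare\certs\TDECert.pvk',
    ENCRYPTION BY PASSWORD = N'<strong password here>'
);
```

Since the writer is always the service account, one ACL-free approach is pointing TO FILE at a share whose folder ACL already grants the PowerShell account read, so ownership of the file no longer matters for archiving.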
I'm just curious if anybody is using Powershell DSC to manage their onprem SQL Servers. If so, are you using the repository here:
https://github.com/dsccommunity/SqlServerDsc
And is Powershell DSC 3.0 safe to use in production for provisioning new servers or is it recommended to stick with v1/v2?
I've been through really tough situations throughout my almost two years of being a SQL DBA in a bank.
The tasks themselves are not hard, and I try to be proactive: I check all our instances daily and try to make sure everything is running well. But sometimes shit happens, and whoever is using an app that connects to a database with an issue doesn't have the patience, and all of a sudden you get reported to upper management.
So, how can someone survive this job?
As a DBA in training, I'm researching DB optimizations. I started out blind and green, using this sub's recommendation of Brent Ozar. His emails and explanation pages have really opened my eyes.
My question is this: when I use the Database Engine Tuning Advisor, is it optimal to choose the Query Store option after having enabled Query Store? There are also options for Plan Cache and others.
It is not clear to me which options suit which workload. The workload here is Query Store and post-analysis.
Thanks!
Hi all,
I'm getting this strange message when any user login, Windows or local, tries to connect but does not have sysadmin permissions. This appears when users log in to either of the nodes in my Always On cluster. This does not happen on other servers for the same users. I have tried rebooting one of the nodes, and it still persisted. I can make this go away by granting CONNECT to the public role, but shouldn't that be implicit? Any insight would be helpful.
Thanks!
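For the record, the grant that makes this symptom go away is the endpoint-level one that public normally holds by default; a sketch of what restoring it looks like:

```sql
-- Restore the default connect permission on the TCP endpoint.
-- Hardening scripts sometimes revoke this from public, which makes
-- every non-sysadmin login fail in exactly this way.
GRANT CONNECT ON ENDPOINT::[TSQL Default TCP] TO [public];
```

Comparing sys.server_permissions for the endpoint between an affected node and a healthy server should confirm whether a revoke is the culprit.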
I want to make a transition to DBA. In my current role I essentially fill the role of a junior DBA: I set up simple backup policies, optimize indexes, and tune queries.
I currently lack knowledge of the server upgrade process, setting up a server from scratch, VMs, and cloud hosting. These are things I'm trying to pick up via self-study.
In addition to the crucial knowledge mentioned above, what non-SQL skills should I acquire to complement it?