Updating Statistics

Statistics and SQL Server. When people ask me what statistics are, I answer with a Google Maps analogy: we know the roads are there, but the traffic changes, and just as it is best to patiently wait for the traffic data to update before choosing a route, we have to give SQL Server a way to keep up with the data in its tables. I have all statistics updated at a fixed time – 7 pm, actually – unless there are large loads overnight, in which case the job runs after the loads.

This is the SQL I use for updating statistics. As you can see, it references other objects, and while I'll eventually get around to sharing those, it won't run properly without them. Why a custom procedure? Check out what these smart people say:

  1. www.scarydba.com
  2. www.sqlshack.com
  3. www.brentozar.com
  4. etc…

What I took from these posts is: don't run the default maintenance, because it will reread your data and update items that don't need attention, which is inefficient. I also wanted logging, to see what was requiring updates, and to avoid blocking. The procedure includes an output parameter that returns the count of objects requiring updates, in case I am curious.
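Since the full procedure isn't shareable yet, here is a minimal sketch of the targeted approach, assuming a plain row-modification threshold; the 1000-row cutoff and the blanket coverage of all user objects are my illustration, not the real procedure's logic (sys.dm_db_stats_properties needs 2008 R2 SP2 or 2012 SP1 and later):

-- Build UPDATE STATISTICS statements only for statistics with real churn.
DECLARE @sql nvarchar(max) = N'';

SELECT @sql = @sql + N'UPDATE STATISTICS '
    + QUOTENAME(SCHEMA_NAME(o.schema_id)) + N'.' + QUOTENAME(o.name)
    + N' ' + QUOTENAME(s.name) + N';' + NCHAR(13)
FROM sys.stats AS s
JOIN sys.objects AS o
  ON o.object_id = s.object_id
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE o.is_ms_shipped = 0
  AND sp.modification_counter > 1000;  -- assumed threshold, tune to taste

EXEC sys.sp_executesql @sql;

The real procedure also logs each statistic it touches before updating, which is where the "what needed attention" curiosity gets satisfied.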


SQL Licensing and the Cluster

We license our servers by core, as we are all Enterprise edition. Three things drive this choice: some systems have memory requirements Standard edition cannot meet, we need mirroring (for 2008 R2; high-performance mode for 2012), and processor power is not a limiting factor for us. On the four-node cluster, then, we have paid for two active nodes' worth of cores and filled both to their maximum memory of 512 GB. Three SQL Server instances require more than the 64 GB Standard edition limit to perform well; one gets all 512 GB, while the remaining two, along with four more instances, share the other blade's 512 GB. We have discussed, though not yet implemented, disabling sockets so we can afford more blades for other areas. With the VM SQL Servers we also ended up paying by core, and have all of them on a limited number of blades.


Making use of the model Database

All the websites and books talk about how the model database is the template for every future database, including tempdb, so you should add your important stuff there. In my experience, there is rarely a genuine need for the same function in every database; in fact, what a maintenance pain that would be! I have had to put a user-defined data type into model exactly once, because tempdb required it – the third-party app was written that way. The only model configuration that has proven its worth time and again for me is setting file growth to 250 MB for the data and log files. Anyone have any other genuine success stories of how configuring the model database saved the world?
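For reference, that tweak is a pair of one-liners; modeldev and modellog are the default logical file names, so check sys.master_files if yours differ:

-- Fixed 250 MB autogrowth on model, inherited by every new database.
USE master;
GO
ALTER DATABASE model MODIFY FILE (NAME = modeldev, FILEGROWTH = 250MB);
ALTER DATABASE model MODIFY FILE (NAME = modellog, FILEGROWTH = 250MB);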


Rehearsals = Good

I worked construction during high school and college summers and holidays. The people I met were a ton of fun and had great stories; the ones who'd been incarcerated had the best. My move from construction to computers included many discussions comparing the two. Something specific about computer work I really appreciate is the opportunity to rehearse. To put it another way: there is no CTRL+Z for the improperly cut 2×4. This gift is exquisite, should not be ignored, and, when coupled with scripting skills, has phenomenal potential.

Let me tell a story about my first deployment as the primary DBA at a company with a hub-and-spoke, bidirectional transactional replication setup. I was working with committed professionals at a prosperous company with good management. I didn't want to be the problem! So I practiced. I had the systems guys set me up a series of unused desktops on a separate domain, named the same as the production SQL Servers they were to mimic. Then I set out creating the scripts which would drop the subscriptions and articles, execute the alter and create statements, and put it all back together again. The computers were put in a mostly empty storage room, and that is where I sat until I was ready. It was an epic weekend and the results were awesome: I took 3 hours to complete what had taken the previous DBA about 30 hours.

Of course there are times when there is no room for practice. At least the first time. Once the matter is resolved, stay serious and take a close look at why you were in that situation. What will you do to give yourself the best chance of never going there again? Be humble and use every opportunity to rehearse.


Execute As

We all want our databases to be owned by something other than an individual login – especially the model database. We can run ALTER AUTHORIZATION ON DATABASE::<<db name>> TO sa after the fact, but why create the problem in the first place? An easy way to avoid the situation, without having to log in as sa, is to run EXECUTE AS LOGIN = 'sa' prior to running the restore command. This is doubly useful when restoring databases WITH NORECOVERY for mirroring, when you can't run the ALTER AUTHORIZATION until later.
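A minimal sketch of the pattern – the database name and backup path are made up for illustration:

-- Impersonate sa so the restored database comes out owned by sa.
EXECUTE AS LOGIN = 'sa';

-- Hypothetical names; WITH NORECOVERY leaves the database restoring,
-- ready to become a mirror partner.
RESTORE DATABASE SalesDB
FROM DISK = N'\\backupserver\share\SalesDB.bak'
WITH NORECOVERY;

REVERT;  -- back to your own login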


Searching RDLs

As the company is moving, we needed to search all the reports for hard-coded address data. This smart individual came up with an easy way to extract all the RDLs from the ReportServer database. I was especially impressed because it used bcp instead of the obtuse rs.exe. (At least in SQL 2008 there was RSScripter.) Anyway, once the reports are extracted, almost any text editor better than Notepad can search every file in the folder for a particular string. While it is of course best for the report to reference an image by an include or URL, this was an easy way to get the job done.
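The gist of the bcp approach, sketched from memory – the server name, output folder, and format file are my assumptions (bcp needs a format file describing the single varbinary Content column; running bcp interactively once and saving the answers will generate one):

-- Generate one bcp command per report; in Catalog, Type = 2 is a report
-- and Content holds the RDL bytes.
SELECT 'bcp "SELECT Content FROM ReportServer.dbo.Catalog WHERE ItemID = '''
     + CONVERT(varchar(36), ItemID)
     + '''" queryout "C:\RDL\' + Name + '.rdl" -S YourServer -T -f C:\RDL\rdl.fmt'
FROM ReportServer.dbo.Catalog
WHERE Type = 2;

Paste the output into a command prompt and the .rdl files land in the folder, ready for searching.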


Educating Your Developers

Great departments have great people. No kidding! So how do we end up with quality developers? Time has shown we can't hire the best for every position, and, inevitably, something will come up which requires growth, whether personal or professional, to successfully overcome. I believe a DBA can be a huge help here by giving classes on topics which fill in department gaps. These are some of the classes I've given: SQL Profiler; using SQL constraints; performance tuning, focused on the basics, partly so people can do something about it and partly so they understand their own impact; error handling; why memory is so valuable (this also comes up when discussing performance tuning); and reviews of utility SQL such as the bcp procedure and the stored procedure template.

Once there is a basic understanding, more can happen in a smaller group or one on one. The goal is for developers to hand in SQL which always passes code review, so that the SQL Servers are never the issue.


Hard Coded Values

Often we see code with business logic in the form of hard-coded values – for example, a long, comma-separated IN list that spells out a category. These lists are, of course, data duplication, which nobody wants. It is worth the time to create views for these business definitions and reuse the views everywhere. Functions are trickier because of their potential impact on performance. A self-joining table that groups the category values under a parent ID also works, as does a dedicated lookup table.
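A quick sketch of the view approach – the table and category names are invented for illustration:

-- Define the business rule once, in one place.
CREATE VIEW dbo.vw_BeverageProducts
AS
SELECT ProductID
FROM dbo.Products
WHERE CategoryCode IN ('BEER', 'WINE', 'SODA');
GO

-- Every consumer joins to the view instead of repeating the list.
SELECT s.SaleID, s.Amount
FROM dbo.Sales AS s
JOIN dbo.vw_BeverageProducts AS b
  ON b.ProductID = s.ProductID;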


Performance Tuning from a High Level

We had a report which ran "forever". Seriously, it was a long time. After restoring production backups onto a test instance, I ran the same SQL. Not good – I could have grown the coconuts whose water I was drinking. There were some indexing opportunities, and they made a difference on my test SQL Server, though on the production box, running the stored procedure against the newly indexed tables, the wait was still not acceptable. What was interesting: in Profiler, for the production procedure execution – even stopping it after a few minutes, when test was down to a few seconds – the reads were over 10x what I saw on test for a properly completed run. On the other hand, running the procedure's innards as plain statements gave a good execution time and the same reads as test. This made me think of parameter sniffing. Sure enough, after adding local variables set to the input parameter values, we had a speedy execution.
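For anyone who hasn't run into it, here is the shape of the local-variable fix – the procedure and column names are hypothetical:

-- The optimizer can't sniff local variables, so it builds the plan from
-- average density instead of the first caller's parameter values.
CREATE PROCEDURE dbo.usp_GetOrders
    @CustomerID int,
    @StartDate  date
AS
BEGIN
    DECLARE @LocalCustomerID int  = @CustomerID,
            @LocalStartDate  date = @StartDate;

    SELECT OrderID, OrderDate, Amount
    FROM dbo.Orders
    WHERE CustomerID = @LocalCustomerID
      AND OrderDate >= @LocalStartDate;
END;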


Searching for Emails in Tidal Scheduler

TES stores the emails used in email actions in the table dbo.tskmail. To find a particular email address, use a query like this:

select * from dbo.tskmail where tskmail_to_ext like '%your_email@somewhere.com%'
