This post is basically to answer a question asked in class this week: How can we get the last statistics update date for ALL user tables in a database? After working on the query for a while, I realized that the new metadata function I posted about here can give you that info easily: SELECT object_name(sp.object_id) as object_name,name as stats_name, sp.stats_id, last_updated, rows, rows_sampled, steps, unfiltered_rows, modification_counter FROM sys.stats AS s CROSS APPLY sys.dm_db_stats_properties(s.object_id,…(read more)
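The query in the excerpt is cut off. A minimal sketch of the full pattern might look like the following; the filter to user tables and the ordering are my additions, not necessarily the original author's:

```sql
-- Last statistics update date for all statistics on user tables.
-- sys.dm_db_stats_properties requires SQL Server 2008 R2 SP2 / 2012 SP1 or later.
SELECT
    OBJECT_NAME(s.object_id) AS object_name,
    s.name                   AS stats_name,
    sp.stats_id,
    sp.last_updated,
    sp.rows,
    sp.rows_sampled,
    sp.steps,
    sp.unfiltered_rows,
    sp.modification_counter
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE OBJECTPROPERTY(s.object_id, 'IsUserTable') = 1  -- user tables only
ORDER BY sp.last_updated DESC;
```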
Nested loops join query plans can be a lot more interesting (and complicated!) than is commonly realized. One query plan area I get asked about a lot is prefetching. It is not documented in full detail anywhere, so this seems like a good topic to address in a blog post. The examples used in this article are based on questions asked by Adam Machanic . Test Query The following query uses the AdventureWorks sample database (as usual). Run it a couple of times if necessary so you see the query plan when…(read more)
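The article's actual test query is truncated above. As a stand-in (my own example, not the article's), a simple AdventureWorks join like this typically produces a nested loops plan, and with a large enough outer input you can observe prefetching by looking for the WithUnorderedPrefetch attribute on the Nested Loops operator in the plan XML:

```sql
-- Force a nested loops join for demonstration purposes;
-- inspect the actual execution plan XML for prefetch attributes.
SELECT soh.OrderDate, sod.SalesOrderID, sod.ProductID
FROM Sales.SalesOrderHeader AS soh
JOIN Sales.SalesOrderDetail AS sod
    ON sod.SalesOrderID = soh.SalesOrderID
WHERE soh.OrderDate >= '20080101'
OPTION (LOOP JOIN);
```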
I had a database where the log file kept growing and used 99.99% of the available space. The database recovery model was set to “SIMPLE” and there was no replication setup for this database, but the transaction log kept growing. The next thing I looked at was Change Data Capture (CDC) to see if that was the issue. I ran the query below to see if CDC was enabled for this database, but it didn’t return anything. I was still having issues with this database and I noticed miscellaneous change data capture objects still in the database as well as open transactions. This was causing my transaction log to continue to grow, but I couldn’t disable CDC, because SQL Server thought it was not enabled.
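The diagnostic queries the author ran are not shown in this excerpt; checks along these lines (the database name is a placeholder) would cover the same ground:

```sql
-- Is CDC enabled at the database level?
SELECT name, is_cdc_enabled
FROM sys.databases
WHERE name = N'MyDatabase';  -- hypothetical database name

-- Are there leftover CDC objects, even if is_cdc_enabled = 0?
SELECT o.name, o.type_desc
FROM sys.objects AS o
WHERE o.schema_id = SCHEMA_ID(N'cdc');

-- Check for open transactions in the current database
DBCC OPENTRAN;
```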
Fellow MVP Paul Randal just blogged about an issue he has discovered with how the master and model databases are created in SQL Server 2005, SQL Server 2008, SQL Server 2008 R2, and SQL Server 2012. I just checked, and the same issue is present in SQL Server 2014 CTP1 (Build 11.0.9120).
Here is a more heavily commented version of Paul’s code to detect and fix the issue (that is also easier to copy and paste). Please read Paul’s post for more details and background.
-- Fixing a Bug: DBCC CHECKDB data purity checks are skipped for master and model
-- By: Paul Randal
-- Posted on: August 29, 2013 1:15 pm
-- http://www.sqlskills.com/blogs/paul/bug-dbcc-checkdb-data-purity-checks-are-skipped-for-master-and-model/

-- Turn on a session-level trace flag
DBCC TRACEON (3604);

-- Look for the value of dbi_dbccFlags = 0. We want it to be dbi_dbccFlags = 2
DBCC DBINFO (N'master');
GO
DBCC DBINFO (N'model');
GO

-- This will set the value for dbi_dbccFlags correctly as part of running DBCC CHECKDB
DBCC CHECKDB (N'master') WITH DATA_PURITY, NO_INFOMSGS;
GO
DBCC CHECKDB (N'model') WITH DATA_PURITY, NO_INFOMSGS;
GO

-- Look for the value of dbi_dbccFlags = 2
DBCC DBINFO (N'master');
GO
DBCC DBINFO (N'model');
GO

-- Turn off the trace flag (even though it is just a session trace flag)
DBCC TRACEOFF (3604);
Next months look full of interesting conferences and I’ll be speaking in several of those. I’ll be speaking at the following ones:
17 – 18 September 2013, Germany

SQL Saturday 260 Milano – 8 October 2013, Italy

SQL PASS Nordic – 4 – 6 November 2013, Sweden

SQL PASS Amsterdam – 6 – 8 November 2013, the Netherlands

SQL Saturday 257 Verona – 9 November 2013, Italy

Advanced Data Warehouse Workshop – 20 November 2013, Stockholm

SQL Saturday 264 Ancona – 13 December 2013, Italy
I’ll deliver sessions on Data Quality & Master Data Management, Techniques for the Agile Business Intelligence, and Automating ETL using Metadata (and BIML), and at SQL PASS Nordic I’ll also deliver, again with my friend Thomas Kejser, a very nice workshop:
Data Warehouse Modeling – Making the Right Choices
It’s very likely that I’ll be missing the PASS Summit this year, but at some point I also have to do some work here and now…:)
If you’re going to be at one of those conferences, please come and say “Hi”: it’s always great to meet people from all around the world!
SQL is derided by modern developers as ‘Scarcely Qualifies as a Language’. But just how efficient are the new wave of NoSQL languages touted by bleeding-edge skunk works? This tip is a defense of SQL and of the relational model, and argues for the efficiency and suitability of relational technology. Check out this tip to learn more.
While researching an upcoming blog post on using SQL Server 2012 to help college students study any subject more effectively, I stumbled across Project Tuva. Bill Gates bought the rights to some of Nobel Laureate Richard Feynman’s physics lectures and made them available for free at http://research.microsoft.com/apps/tools/tuva/ . Speaking of recorded lectures, I highly recommend that students listen to podcasts of lectures during their commutes. The main reason I made it through pharmacology and…(read more)
Join community experts and Microsoft leaders October 15–18 at the 2013 PASS Summit in Charlotte! Quentin Clark, VP of Microsoft’s Data Platform Group, is returning to kick off the event with Wednesday’s keynote, talking about the innovations behind Microsoft’s database platform and business intelligence solutions – on-premises and in the cloud. Microsoft Technical Fellow David DeWitt, of the Gray Systems Lab, will share deep insights on In-Memory OLTP’s ability to provide performance improvements for transactional workloads.
Microsoft has an excellent lineup of speakers with deep technical sessions on In-Memory OLTP, Power BI, Windows Azure security, and much, much more. You also won’t want to miss the new interactive workshops, chalk talks from experts in Microsoft’s Data Platforms Group, and everyone’s favorite SQL Clinic, with architecture advice from the Azure CAT team and tech support from CSS.
Here are a few examples:
AlwaysOn in SQL Server 2014 [DBA-301-M] – In this session, we’ll discuss the AlwaysOn (Availability Groups and Failover Cluster Instances) enhancements in SQL Server 2014, from increasing the number of secondary replicas and keeping them available for reads at all times to deploying them easily to Windows Azure. You’ll also learn how SQL Server failover clustering has been enhanced with support for cluster shared volumes and how we improved supportability for AlwaysOn by adding better debugging and logging information. You’ll see these enhancements in action and walk away understanding their business value in your organization.
CAT: From Question to Insight with HDInsight and BI [AD-401-M] – Join this session for the end-to-end story of how to implement a system to answer new questions and gain business insights. You’ll get a deep understanding of the cloud and on-premises options for HDInsight: how to load data into the Azure blob store, create an HDInsight Service cluster, add additional storage locations, and add structure with Hive objects. Then, we’ll look at how to analyze and visualize the data using the Microsoft BI tools to gain insights that can change your business.
CAT: SQL Server on Azure VM – Performance and Best Practices [DBA-305-M] – Learn the performance characteristics of running SQL Server on Windows Azure Virtual Machine (Infrastructure Services). What level of performance can you expect from the infrastructure? What can you do to mitigate issues, tune, and improve performance to meet application needs? We’ll share the various performance metrics we’ve observed during internal tests in the Engineering team, as well as in early customer deployments.
See the full list of sessions at http://www.sqlpass.org/summit/2013/Sessions.aspx
Return of Microsoft Certification testing onsite: Microsoft Learning will be next door to the convention center at the Charlotte Westin hotel, offering Microsoft certification exams, free group exam crams, and free practice tests in the self-study room. Take advantage of the 50% discount on all MCSA and MCSE exams – only $75 per exam! Look out for the official registration link later this week.
Want to learn more about the testing? See the post “Training Opportunities at PASS Summit 2013 – Register Now!” from Thomas LaRock, which answers some of the questions you may have about the certification and training.
Looking forward to seeing everyone in Charlotte!
Almost two years ago, I wrote about a method to use Table-Valued Parameters in SQL 2005 – or basically any environment that doesn’t support them natively.
The idea was to use a view with an ‘instead of’ trigger. Essentially, the trigger acts as a stored procedure that receives all the inserted rows at once, letting you handle them however you want. That could mean distributing the rows into tables as if it were an actual insert, but it also works for running whatever code you like, since a trigger is essentially just a stored procedure.
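The pattern from the earlier post can be sketched roughly as follows; all object names here are hypothetical stand-ins, not the ones from the original post:

```sql
-- A base table the rows will ultimately land in
CREATE TABLE dbo.Orders (OrderID int PRIMARY KEY, CustomerName nvarchar(100));
GO
-- The view acts as the "table-valued parameter"
CREATE VIEW dbo.OrdersTVP
AS
SELECT OrderID, CustomerName FROM dbo.Orders;
GO
-- The INSTEAD OF trigger receives the whole batch at once
CREATE TRIGGER dbo.OrdersTVP_Insert
ON dbo.OrdersTVP
INSTEAD OF INSERT
AS
BEGIN
    SET NOCOUNT ON;
    -- The "inserted" pseudo-table holds every row of the batch,
    -- just as a TVP would; any set-based logic can run here.
    INSERT dbo.Orders (OrderID, CustomerName)
    SELECT OrderID, CustomerName
    FROM inserted
    WHERE OrderID IS NOT NULL;  -- example of arbitrary handling
END;
```

A client (or an SSIS destination) then simply inserts into dbo.OrdersTVP, and the trigger processes the whole batch set-based.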
So anyway – today I got challenged to make it more explicit that this also works within SQL Server Integration Services. SSIS handles batch inputs into views, but if you were hoping to push data into a stored procedure in a data flow, that’s going to be on a row-by-row basis – no TVPs. I’d described it to a class I was teaching, showed them the code from my 2011 blog post, and they pointed out that I hadn’t made it particularly obvious (in my blog post) that you could use that method in SSIS.
The idea is very simple – you just create a view, as I did in my earlier post, and then use it as a destination within SSIS. This can be used for all kinds of methods for which you’d want to use a TVP, including pushing data up into SQL Azure databases.
I am honored to present a day-long preconference before SQL Saturday 255 (Dallas) entitled A Day of SSIS 2012 . Please register here . I’m excited to visit the Dallas area! Here’s some information about the precon – I hope to see you there! Training Description A Day of SSIS was developed by Andy Leonard to train technology professionals in the fine art of using SQL Server Integration Services (SSIS) to build data integration and Extract-Transform-Load (ETL) solutions. The training is focused around…(read more)