Category Archives: pass summit

Why Your Abstract Wasn’t Selected

We’re anxiously waiting to hear from PASS which sessions were selected for the 2014 Summit in November. It’s a big job to go through the hundreds of submissions and pick the sessions that will appeal to the people who will be paying over $1,000 to attend this annual event. As I am also waiting to hear the results, I saw this article addressed to actors who didn’t get cast for the part they worked so hard to audition for, and it seemed appropriate to address the same issues for would-be Summit speakers….(read more)

Abstract Submission Day for Speaking 2014…

Today is my least favorite part of the whole speaking thing. It is the day that I put myself out there and wait to see if I get rejected or accepted. Rejection stinks, but at the same time, being accepted means more work. Neither is perfect, and as I stated in my last post, I also have a goal of not getting myself stuck spending 7 hours a day doing work outside of my day job. I plan to put in abstracts to most of the conferences I expect to attend this year. And when I say attend, I mean I plan…(read more)

SQL in the City (Charlotte) Wrap Up

OK, it has been quite a while since the event, two weeks and a day to be exact, but I needed a rest before hitting Windows Live Writer again. Speaking is exhausting, traveling is exhausting, and, well, I replaced my laptop and had to get all of my software back together. (Between Windows 8.1 sync features, Dropbox and SkyDrive, it has never been easier…but I digress.) There are plenty of great vendors out there, but one of my favorites has always been Red Gate. I have written half of a book with them,…(read more)

Cloud Data Warehousing – The Fastest Time To Value

In the past, deploying a data warehouse has been a costly affair.  IT departments needed the expertise to procure and build state-of-the-art hardware servers (optimally balanced from a CPU, storage and I/O perspective) as well as to install software tuned for optimal performance.  Before even loading data into the system, you could already be months into the project and hundreds of thousands of dollars into your infrastructure investment, not to mention the resources needed to maintain these servers and run them at peak performance levels.

To address these challenges, Microsoft announced at the PASS Summit conference the ability to deploy a tuned SQL Server Enterprise image for data warehousing in the cloud on Windows Azure.  We now deliver a pre-tuned version of SQL Server Enterprise specifically for data warehousing and host it in a Windows Azure Virtual Machine. This image leverages Microsoft best practices from the on-premises Fast Track reference architecture certification to tune SQL Server for data warehousing in Windows Azure. Users can provision a highly tuned data warehouse image within minutes, without any knowledge of Azure storage configurations or expertise in optimizing SQL Server for data warehousing workloads. This is an ideal solution for customers who want a data warehouse quickly without managing a hardware infrastructure.

Microsoft has built strong momentum in the cloud, with 50% of the Fortune 500 on board and 7,000 new Azure IaaS customers added each week.  HarperCollins Publishers was one of these customers, deploying a SQL Server 2012 Enterprise solution in Azure within two weeks and ultimately saving more than six months of deployment time and over $200,000 in development costs. Deploying SQL Server for data warehousing in Windows Azure Virtual Machines gives customers the fastest time to value at the lowest cost.

With this cloud data warehouse solution, Microsoft rounds out a suite of existing data warehouse offerings that already includes reference architectures and appliances. Customers now have an unprecedented number of options to deploy a data warehouse on-premises, in the cloud, or in a hybrid of both.

We are excited to enable you to deploy SQL Server for data warehousing in Windows Azure Virtual Machines and invite you to learn more.

Live from the #summit13 keynote: 2013-10-17

Douglas McDowell (EVP Finance) takes the stage (no kilt), and talks numbers. PASS has an impressive $1MM in reserves as a “rainy day” fund. Last fiscal year they spent $7.6MM on community; 30% of that was spent internationally. Bill Graziano comes on (no kilt) to say goodbye and thanks to the outgoing board members, Douglas McDowell, Rob Farley and Rushabh Mehta. Thomas LaRock comes on. No kilt, but he did tuck his shirt in. He introduces the incoming executive team. The 2014 PASS Business Analytics…(read more)

The 2013 PASS Summit – Day 2

Good morning! It’s Day 2 of the PASS Summit 2013 and it should be a busy one. Douglas McDowell, EVP Finance of PASS opened up the keynote to welcome people and talked about the financial status of the organization. Last year’s Business Analytics Conference left the organization $100,000 ahead, and he went on to show the overall financial health, which is very good at this point. Bill Graziano came out to thank Doug, Rob Farley and Rushabh Mehta for their service on the board, as they step down from…(read more)

Speaking – Automate Your ETL Infrastructure with SSIS and PowerShell

Today at 4:45PM EDT I’m presenting a new session using PowerShell to auto-generate SSIS packages via the BIML language. The really cool thing is that this session will be broadcast live on PASS TV! You can view the session by clicking on this link. If you have questions for me during the session, you can send them to me via Twitter using the hashtag #posh2biml. Brian Davis, my good friend from the Ohio North SQL Server Users Group, will be monitoring that hashtag and feeding me the questions that…(read more)

SQL Server 2014: Pushing the Boundaries of In-Memory Performance

This morning, during my keynote at the Professional Association for SQL Server (PASS) Summit 2013, I discussed how customers are pushing the boundaries of what’s possible for businesses today using the advanced technologies in our data platform. It was my pleasure to announce the second Community Technology Preview (CTP2) of SQL Server 2014, which features breakthrough performance with In-Memory OLTP and simplified backup and disaster recovery in Windows Azure.

Pushing the boundaries

We are pushing the boundaries of our data platform with breakthrough performance, cloud capabilities and the pace of delivery to our customers. Last year at PASS Summit, we announced our In-Memory OLTP project “Hekaton,” and since then we have released SQL Server 2012 Parallel Data Warehouse and public previews of Windows Azure HDInsight and Power BI for Office 365. Today we have SQL Server 2014 CTP2, our public and production-ready release, shipping a mere 18 months after SQL Server 2012.

Our drive to push the boundaries comes from recognizing that the world around data is changing.

  • Our customers are demanding more from their data – higher levels of availability as their businesses scale and globalize, major advancements in performance to align to the more real-time nature of business, and more flexibility to keep up with the pace of their innovation. So we provide in-memory, cloud-scale, and hybrid solutions. 
  • Our customers are storing and collecting more data – machine signals, data from devices and services, and data from even outside their organizations. So we invest in scaling the database and in a Hadoop-based solution. 
  • Our customers are seeking the value of new insights for their business. So we offer them self-service BI in Office 365 delivering powerful analytics through a ubiquitous product and empowering users with new, more accessible ways of gaining insights. 

In-memory in the box for breakthrough performance

A few weeks ago, one of our competitors announced plans to build an in-memory column store into their database product at some point in the future. We shipped similar technology two years ago in SQL Server 2012, and have continued to advance that technology in SQL Server 2012 Parallel Data Warehouse and now with SQL Server 2014. In addition to our in-memory columnar support in SQL Server 2014, we are also pushing the boundaries of performance with in-memory online transaction processing (OLTP). A year ago we announced project “Hekaton,” and today we have customers realizing performance gains of up to 30x. This work, combined with our early investments in Analysis Services and Excel, means Microsoft is delivering the most complete in-memory capabilities for all data workloads – analytics, data warehousing and OLTP.
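
As a point of reference, here is a minimal sketch of the updatable clustered columnstore index that ships in SQL Server 2014; the table and index names are hypothetical, not taken from the keynote:

  -- Hypothetical fact table for illustration.
  CREATE TABLE dbo.FactSales
  (
      SaleID    BIGINT        NOT NULL,
      ProductID INT           NOT NULL,
      SaleDate  DATE          NOT NULL,
      Amount    DECIMAL(18,2) NOT NULL
  );

  -- SQL Server 2014 adds a clustered columnstore index: the whole table is
  -- stored column-wise and, unlike the SQL Server 2012 nonclustered
  -- columnstore, remains updatable.
  CREATE CLUSTERED COLUMNSTORE INDEX cci_FactSales ON dbo.FactSales;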

We do this to allow our customers to make breakthroughs for their businesses. SQL Server is enabling them to rethink how they can accelerate and exceed the speed of their business.

  • TPP is a clinical software provider managing more than 30 million patient records – half the patients in England – with 200,000 active registered users from the UK’s National Health Service. Their systems handle 640 million transactions per day, peaking at 34,700 transactions per second. They tested a next-generation version of their software with the SQL Server 2014 in-memory capabilities, which enabled their application to run seven times faster than before – all of it up and running in half a day. 
  • Ferranti provides solutions for the energy market worldwide, collecting massive amounts of data using smart metering. With our in-memory technology, they can now process a continuous data flow from up to 200 million measurement channels, making the system fully capable of meeting the demands of smart meter technology.
  • SBI Liquidity Market in Japan provides online services for foreign currency trading. By adopting SQL Server 2014, the company has increased throughput from 35,000 to 200,000 transactions per second. They now have a trading platform that is ready to take on the global marketplace.

A closer look at In-Memory OLTP

Previously, I wrote about the journey of the in-memory OLTP project “Hekaton,” in which a group of SQL Server database engineers collaborated with Microsoft Research. Changes in the ratios between CPU performance, I/O latency and bandwidth, and cache and memory sizes, along with innovations in networking and storage, were changing the assumptions and design behind the next generation of data processing products. This gave us the opening to push the boundaries of what we could engineer, without the constraints that existed when relational databases were first built many years ago. 

Challenging those assumptions, we engineered for dramatically better latency and throughput on so-called “hot” transactional tables in the database. Lock-free, row-versioned data structures and the compilation of T-SQL queries into native code, combined with programming semantics consistent with the rest of SQL Server, mean our customers can gain the performance benefits of extreme transaction processing without application rewrites or the adoption of entirely new products. 
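
To make that concrete, here is a minimal sketch of the In-Memory OLTP syntax in SQL Server 2014 CTP2; the database, table and procedure names are hypothetical:

  -- One-time setup: the database needs a filegroup for memory-optimized data.
  ALTER DATABASE SalesDB ADD FILEGROUP imoltp_fg CONTAINS MEMORY_OPTIMIZED_DATA;
  ALTER DATABASE SalesDB ADD FILE (NAME = 'imoltp_dir', FILENAME = 'C:\Data\imoltp_dir')
      TO FILEGROUP imoltp_fg;

  -- A "hot" table kept entirely in memory; SCHEMA_AND_DATA keeps it fully durable.
  CREATE TABLE dbo.Orders
  (
      OrderID  INT NOT NULL
          PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1048576),
      Quantity INT NOT NULL
  )
  WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);

  -- T-SQL compiled to native machine code; BEGIN ATOMIC replaces an explicit transaction.
  CREATE PROCEDURE dbo.InsertOrder @OrderID INT, @Quantity INT
  WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
  AS
  BEGIN ATOMIC WITH (TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = N'us_english')
      INSERT INTO dbo.Orders (OrderID, Quantity) VALUES (@OrderID, @Quantity);
  END;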

The continuous data platform

Windows Azure enables new scenarios for our customers – transcending the divide between on-premises and the cloud. Microsoft is providing a continuous platform, from our traditional products that run on-premises to our cloud offerings. 

With SQL Server 2014, we are bringing the cloud into the box. We are delivering high availability and disaster recovery on Windows Azure built right into the database. This enables customers to benefit from our global datacenters: AlwaysOn Availability Groups that span on-premises servers and Windows Azure Virtual Machines, database backups directly into Windows Azure storage, and even the ability to store and run database files directly in Windows Azure storage. That last scenario really does something interesting – you can now have an effectively infinite hard drive with excellent disaster recovery properties, while keeping the local latency and performance of the on-premises database server. 
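
As a rough sketch of that last scenario – SQL Server Data Files in Windows Azure – the storage account, container and database names below are placeholders:

  -- The credential name must match the container URL; the secret is a
  -- Shared Access Signature for that container.
  CREATE CREDENTIAL [https://mystorageaccount.blob.core.windows.net/data]
  WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
       SECRET   = '<SAS token for the container>';

  -- The database files live in Windows Azure blob storage even though the
  -- instance itself runs on-premises.
  CREATE DATABASE HybridDB
  ON (NAME = HybridDB_data,
      FILENAME = 'https://mystorageaccount.blob.core.windows.net/data/HybridDB_data.mdf')
  LOG ON (NAME = HybridDB_log,
      FILENAME = 'https://mystorageaccount.blob.core.windows.net/data/HybridDB_log.ldf');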

We’re not just providing easy backup in SQL Server 2014; today we also announced that backup to Windows Azure will be available for all currently supported SQL Server releases. Together, the backup to Windows Azure capabilities in SQL Server 2014 and the standalone tool offer customers a single, cost-effective backup strategy for secure off-site storage, with encryption and compression, across all supported versions of SQL Server.
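
For reference, a minimal backup-to-URL sketch in SQL Server 2014 syntax; the account, container and credential names are placeholders:

  -- A credential holding the storage account name and access key.
  CREATE CREDENTIAL AzureBackupCred
  WITH IDENTITY = 'mystorageaccount',
       SECRET   = '<storage account access key>';

  -- Back up straight to blob storage; COMPRESSION reduces what crosses the wire.
  BACKUP DATABASE SalesDB
  TO URL = 'https://mystorageaccount.blob.core.windows.net/backups/SalesDB.bak'
  WITH CREDENTIAL = 'AzureBackupCred', COMPRESSION;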

By having a complete and continuous data platform we strive to empower billions of people to get value from their data. It’s why I am so excited to announce the availability of SQL Server 2014 CTP2, hot on the heels of the fastest-adopted release in SQL Server’s history, SQL Server 2012. Today, more businesses solve their data processing needs with SQL Server than any other database. It’s about empowering the world to push the boundaries.

Quentin Clark
Corporate Vice President
Data Platform Group

Live from the #summit13 keynote: 2013-10-16

Early morning start here in Charlotte. I’m going to try to keep this post updated as I have new information from the keynote to share, so refresh often! 8:24 AM Bill Graziano takes the stage and welcomes us to the 15th PASS Summit. He mentions that PASS delivered over 700,000 hours of technical training in the previous fiscal year, and shows a Power BI Power Map video talking about all of the SQL Saturday accomplishments in the last few years. He introduces Amy Lewis, who wins this year’s PASSion…(read more)

The 2013 PASS Summit – Day 1

It’s SQL Server Geek Week once again! Every year at the PASS Summit the SQL Server faithful descend on the city of choice for the annual Summit, and this year it’s Charlotte, North Carolina. Once again I’ve been given the privilege of sitting at the bloggers table, so my laptop is on a table! So far this week it’s been great seeing people I get to see just once a year. I attended Red Gate’s SQL in the City event on Monday, and saw some great sessions from Grant Fritchey, Steve Jones and Nigel Sammy….(read more)