
Results are Beautiful: 4 Best Practices for Big Data in Healthcare

When you put big data to work, the results can be beautiful, especially when those results are as impactful as saving lives. Here are four best-practice examples of how big data is being used in healthcare to improve, and often save, lives.

Aerocrine improves asthma care with near-real-time data

Millions of asthma sufferers worldwide depend on Aerocrine monitoring devices to diagnose and treat their disease effectively. But those devices are sensitive to small changes in the ambient environment. That’s why Aerocrine is using a cloud analytics solution to boost their reliability. Read more.

Virginia Tech advances DNA sequencing with cloud big data solution

DNA sequencing analysis is a form of life sciences research that has the potential to lead to a wide range of medical and pharmaceutical breakthroughs. However, this type of analysis requires supercomputing resources and big data storage that many researchers lack. Working through a grant provided by the National Science Foundation in partnership with Microsoft, a team of computer scientists at Virginia Tech addressed this challenge by developing an on-demand, cloud-computing model using the Windows Azure HDInsight Service. By moving to an on-demand cloud computing model, researchers will now have easier, more cost-effective access to DNA sequencing tools and resources, which could lead to even faster, more exciting advancements in medical research. Read more.
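To make that concrete, here is a minimal sketch of the kind of job a Hadoop-based service like HDInsight parallelizes for sequencing work: counting k-mers (short DNA substrings) across millions of reads with Hadoop Streaming. The mapper/reducer pair below is an illustrative assumption written in Python, not Virginia Tech’s actual pipeline; the input format (one read per line) and k-mer length are also assumptions.

    # mapper.py - emit each k-mer found in a DNA read (one read per input line).
    # Illustrative sketch, not Virginia Tech's pipeline.
    import sys

    K = 21  # assumed k-mer length, typical for sequencing analysis

    for line in sys.stdin:
        read = line.strip().upper()
        for i in range(len(read) - K + 1):
            kmer = read[i:i + K]
            if set(kmer) <= set("ACGT"):  # skip k-mers with ambiguous bases
                print(kmer + "\t1")

    # reducer.py - sum the counts for each k-mer (input arrives sorted by key)
    import sys

    current, total = None, 0
    for line in sys.stdin:
        kmer, count = line.rstrip("\n").rsplit("\t", 1)
        if kmer != current:
            if current is not None:
                print(current + "\t" + str(total))
            current, total = kmer, 0
        total += int(count)
    if current is not None:
        print(current + "\t" + str(total))

On an HDInsight cluster, a pair like this would be submitted through the standard Hadoop Streaming jar, letting the cluster fan the work out across however many nodes a researcher provisions for the job.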

The Grameen Foundation expands global humanitarian efforts with cloud BI

Global nonprofit Grameen Foundation is dedicated to helping as many impoverished people as possible, which means continually improving the way Grameen works. To do so, it needed an ongoing sense of its programs’ performance. Grameen and Microsoft brought people and technology together to create a BI solution that helps program managers and financial staff: glean insights in minutes, not hours; expand services to more people; and make the best use of the foundation’s funding. Read more.

Ascribe transforms healthcare with faster access to information

Ascribe, a leading provider of IT solutions for the healthcare industry, wanted to help clinicians identify trends and improve services by supplying faster access to information. However, exploding volumes of structured and unstructured data hindered insight. To solve the problem, Ascribe designed a hybrid-cloud solution with built-in business intelligence (BI) tools based on Microsoft SQL Server 2012 and Windows Azure. Now, clinicians can respond faster with self-service BI tools. Read more.

Learn more about Microsoft’s big data solutions

Virginia Tech Exec Q&A

Virginia Tech is using the Microsoft Azure Cloud to create cloud-based tools to assist with medical breakthroughs via next-generation sequencing (NGS) analysis. NGS analysis requires both big computing and big data resources. A team of computer scientists at Virginia Tech is addressing this challenge by developing an on-demand, cloud-computing model using the Azure HDInsight Service. By moving to an on-demand cloud computing model, researchers will now have easier, more cost-effective access to DNA sequencing tools and resources, which could lead to even faster, more exciting advancements in medical research.

We caught up with Wu Feng, a professor in the Department of Computer Science, the Department of Electrical & Computer Engineering, and the Health Sciences at Virginia Tech, to discuss the benefits he is seeing with cloud computing.

Q: What is the main goal of your work?

We are working on accelerating our ability to use computing to assist in the discovery of medical breakthroughs, including the holy grail of “computing a cure” for cancer. While we are just one piece of a giant pipeline in this research, we seek to use computing to more rapidly understand where cancer starts in the DNA. If we could identify where and when mutations are occurring, it could provide an indication of which pathways may be responsible for the cancer and could, in turn, help identify targets to help cure the cancer. It’s like finding a “needle in a haystack,” but in this case we are searching through massive amounts of genomic data to try to find these “needles” and how they connect and relate to each other “within the haystack.”

Q: What are some ways technology is helping you?

We want to enable scientists, engineers, physicists, and geneticists by equipping them with tools so they can focus on their craft and not on the computing. There are many interesting computing and big data questions that we can help them with along this journey of discovery.

Q: Why is cloud computing with Microsoft so important to you?

The cloud can accelerate discovery and innovation by computing answers faster, particularly when you don’t have bountiful computing resources at your disposal. It enables people to compute on data sets that they might not have otherwise tried because they didn’t have ready access to such resources.

For any institution, whether a company, government lab or university, the cost of creating or updating datacenter infrastructure, such as the building, the power and cooling, and the raised floors, just so a small group of people can use the resource, can outweigh the benefits. Having a cloud environment with Microsoft allows us to leverage the economies of scale to aggregate computational horsepower on demand and give users the ability to compute big data, while not having to incur the institutional overhead of personally housing, operating and maintaining such a facility.

Q: Do you see similar applications for businesses?

Just as the Internet leveled the playing field and served as a renaissance for small businesses, particularly those involved with e-commerce, so will the cloud. By commoditizing “big data” analytics in the cloud, small businesses will be able to intelligently mine data to extract insight for activities such as supply-chain economics and personalized marketing and advertising.

Furthermore, quantitative analytic tools, such as Excel DataScope in the cloud, can enable financial advisors to accelerate data-driven decision-making via commoditized financial analytics and prediction. Specifically, Excel DataScope delivers data analytics, machine learning and information visualization to the Microsoft Azure Cloud.

In any case, just like in the life sciences, these financial entities have their own sources of data deluge. One example is trades and quotes (TAQ), where the amount of financial information is also increasing exponentially. Unfortunately, to make the analytics process on the TAQ data more tractable, the data is often triaged into summary form, which can inadvertently filter out critical data that should have been kept.

Q: Are you saving money or time or experiencing other benefits?

Back when we first thought of this approach, we were wondering if it would even be feasible in the cloud. For example, with so much data to upload to the cloud, would the cost of transferring data from the client to the cloud outweigh the benefits of computing in the cloud? By cloud-enabling a popular genome analysis pipeline, combined with our synergistic co-design of its algorithms, software, and hardware, we realized about a three-fold speed-up over the traditional client-based solution.
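As a back-of-envelope illustration of that trade-off (every number below is a hypothetical assumption, not a Virginia Tech measurement), the upload only erases the cloud’s advantage when transfer time exceeds the compute time saved:

    # Toy model: does upload time erase the cloud's compute advantage?
    # All numbers are hypothetical illustrations.
    dataset_gb    = 500     # raw sequencing data to move to the cloud
    uplink_mbps   = 1000    # assumed bandwidth from client to cloud
    local_hours   = 48.0    # assumed runtime of the client-based pipeline
    cloud_speedup = 3.0     # the ~3x speed-up mentioned above

    upload_hours = dataset_gb * 8 * 1000 / uplink_mbps / 3600  # GB -> Gb -> Mb -> h
    cloud_hours = local_hours / cloud_speedup + upload_hours

    print("upload %.1f h, cloud total %.1f h, local %.1f h"
          % (upload_hours, cloud_hours, local_hours))
    # The cloud wins whenever upload_hours < local_hours * (1 - 1/cloud_speedup).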

Q: What does the future look like?

There is big business in computing technology, whether it is explicit, as in the case of personal computers and laptops, or implicit, as in the case of smartphones, TVs or automobiles. Just look at how far we have come over the past seven years with mobile devices. However, the real business isn’t in the devices themselves; it’s in the ecosystem and content that support these devices: the electronic commerce that happens behind the scenes. In another five years, I foresee the same thing happening with cloud computing. It will become a democratized resource for the masses. It will get to the point where using storage in the cloud is as easy as flipping a light switch; we won’t think twice about it. The future of computing and data lies in the cloud, and I’m excited to be there as it happens.

 

For more information about Azure HDInsight, check out the website and start a free trial today.

Edgenet Exec Q&A

Edgenet provides optimized product data for suppliers, retailers and search engines. Used online and in stores, Edgenet solutions ensure that businesses and consumers can make purchasing and inventory decisions based on accurate product information. Last year, it implemented an In-Memory OLTP solution built on SQL Server 2014, which has helped it continue to innovate and lead in its business.

We caught up with Michael Steineke, Vice President of IT at Edgenet, to discuss the benefits he has seen since Edgenet implemented SQL Server 2014.

Q: Can you give us a quick overview of what Edgenet does?

A: We develop software that helps retailers sell products in the home building and automotive industries. We work with both large and small dealers and provide software that helps customers determine and compare which products are available in a local store.

We provide the specs, pictures, manuals, diagrams, and all the rest of the information that a customer would need to make an informed decision. We take all of this data, standardize it, and provide it to retailers and search engines.

With the major shift to online sales over the past handful of years, retailers need to have relevant and timely product information available so the customer can compare products and buy the best one for their needs.

In a single store, inventory is easy. In a chain where you have 1,000 or 5,000 stores, that gets very complicated. Our company is built on product data, and we need a powerful solution to manage it.

Q: What is your technology solution?

A: We are using In-Memory OLTP in SQL Server 2014 to power our inventory search. Our applications make sure we have the right products listed, with accurate pricing and availability, and we couldn’t do it without In-Memory OLTP.

Q: What types of benefits have you seen since deployment?

A: SQL Server 2014 and In-Memory OLTP have helped change our business. Our clients are happy as well. No matter what our customers need, we can do it with our solution. If a retailer wants to supply data to us every 10 minutes, we can update every 10 minutes. It’s the way we like to do business.

Q: Why did you choose to deploy SQL 2014 in your organization?

A: Working with Microsoft was a natural choice, since we are often early adopters of new technologies. Our goal is to utilize the new feature sets of new software as much as possible so that we remain innovators in the field. That was the main reason we were so excited to deploy the In-Memory OLTP features of SQL Server 2014.

Q: What type of data are you managing?

A: Our inventory data isn’t extremely large, but there is a lot of volatility in it. We are talking about managing thousands of products across thousands of stores, with different pricing and availability for each store. There could be hundreds of millions of rows for just one retailer. Our big data implementation is about managing this volatility in the market, and we need a powerful back-end solution to help us handle all sorts of information.

Q: What are the advantages of In-Memory OLTP?

A: The biggest advantage we are getting is the ability to continually keep data up to date, so we always have real-time inventory and pricing. While we are updating, we can continue to use the same tables with little or no impact on performance. We were also able to consolidate two databases (a daily-refreshed copy the application read from, and a database that consumed the updates from our customers) into one in-memory database.
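For readers curious what In-Memory OLTP looks like in practice, here is a minimal sketch of the kind of memory-optimized table that could back an inventory workload like this. The table, column names, connection string and bucket count are hypothetical assumptions; the MEMORY_OPTIMIZED and DURABILITY options are SQL Server 2014’s documented DDL. The sketch uses Python with pyodbc:

    # Sketch: a memory-optimized inventory table in SQL Server 2014.
    # Names, connection string and bucket count are illustrative assumptions.
    # Assumes the database already has a MEMORY_OPTIMIZED_DATA filegroup.
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};SERVER=.;"
        "DATABASE=Inventory;Trusted_Connection=yes;", autocommit=True)

    conn.execute("""
    CREATE TABLE dbo.StoreInventory (
        ProductId INT   NOT NULL,
        StoreId   INT   NOT NULL,
        Price     MONEY NOT NULL,
        Quantity  INT   NOT NULL,
        -- hash index bucket count sized to roughly the expected row count
        CONSTRAINT PK_StoreInventory PRIMARY KEY NONCLUSTERED
            HASH (ProductId, StoreId) WITH (BUCKET_COUNT = 268435456)
    ) WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA)
    """)

SCHEMA_AND_DATA keeps the rows durable across restarts, which matters for inventory; purely transient data can use SCHEMA_ONLY and skip logging entirely.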

 

For more information about SQL Server 2014, check out the website and start a free trial today.

SQL Server 2014 is Customer Tested!

At Microsoft, we have an important program in place to work closely with our customers to ensure high-quality, real-world testing of Microsoft SQL Server before it hits the market for general availability. Internally, we call this the Technology Adoption Program (TAP). It works like this: a select group of customers is invited to collaborate with us very early in the development lifecycle, and together we figure out which features they would benefit most from testing and which workload (or scenario) they will use. They test the upgrade process and then exercise the new features as applicable. Many of these customers end up moving their test workloads into their production environments up to six months prior to the release of the final version. The program obviously benefits Microsoft because, no matter how well we test the product, it is real customer workloads that determine release quality. Our select customers benefit because they are assured that their workloads work well on the upcoming release, and they have the opportunity to work closely with the SQL Server engineering team.

Microsoft SQL Server 2014 is now generally available, and we believe you will enjoy this release for its exciting features: In-Memory OLTP; AlwaysOn enhancements, including new hybrid capabilities; columnstore enhancements; cardinality estimation improvements; and much more. I also believe you will be happy with my favorite feature of all, and that is “reliability.” For an overview of the new features in SQL Server 2014, see the general release announcement.

To give you a better feel for this pre-release customer validation program, I will describe a few examples of customer workloads tested against SQL Server 2014 prior to the release of the product for general availability.

The first customer example is the world’s largest regulated online gaming company. Hundreds of thousands of people visit this company’s website every day, placing more than a million bets on a range of sports, casino games, and poker. SQL Server 2014 enables this customer to scale its applications to 250,000 requests per second, a 16x increase from the 16,000 requests per second it achieved on a previous version of SQL Server, using the same hardware. In fact, due to the performance gains, the company was able to reduce the number of servers running SQL Server from eighteen to one, simplifying its overall data infrastructure significantly. The transactional workload is the session state of online users, which not only has to track tens of thousands of concurrent customers but also must respond quickly and be available at all times to ensure high customer satisfaction. The session-state application, written in ASP.NET, uses heavily accessed SQL Server tables that are now defined as “memory-optimized,” part of In-Memory OLTP, one of the exciting new capabilities of SQL Server 2014. The performance gain significantly improves the user experience and enables a simpler data infrastructure, and no application logic changes were required to get this significant performance bump. This customer’s experience with SQL Server 2014 performance and reliability was so good, it went into production more than a year before we released the product.
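Session state pairs naturally with In-Memory OLTP because it is transient: SQL Server 2014 lets such tables be declared with DURABILITY = SCHEMA_ONLY, which skips logging and disk IO for the rows altogether. The sketch below is a hypothetical illustration of such a table, not this customer’s actual schema. Two SQL Server 2014 specifics are reflected in the comments: memory-optimized tables cannot use (MAX) types, and indexed string columns need a BIN2 collation.

    # Sketch: a memory-optimized, non-durable session-state table.
    # Names and sizes are hypothetical, not the gaming company's schema.
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};SERVER=.;"
        "DATABASE=Sessions;Trusted_Connection=yes;", autocommit=True)

    conn.execute("""
    CREATE TABLE dbo.SessionState (
        SessionId NVARCHAR(88)
            COLLATE Latin1_General_100_BIN2 NOT NULL,  -- BIN2 required in 2014
        Payload   VARBINARY(7000) NOT NULL,  -- (MAX) types not supported in 2014
        Expires   DATETIME2 NOT NULL,
        CONSTRAINT PK_SessionState PRIMARY KEY NONCLUSTERED
            HASH (SessionId) WITH (BUCKET_COUNT = 1048576)
    ) WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_ONLY)
    """)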

The second customer example is a leading global provider of financial trading services, exchange technology, and market insight. Every year, the customer adds more than 500 terabytes of uncompressed data to its archives and has to perform analytics against this high volume of data. As you can imagine, this high volume of data not only costs a lot to store on disk, it can take a long time to query and maintain. To give you a sense of the scale of this customer’s data volume, let me give you a few examples: one of the financial systems processes up to a billion transactions in a single trading day; a different system can process up to a million transactions per second; the data currently collected is nearly two petabytes of historical data. The cost savings on storage of 500+ terabytes of data, now compressed by roughly 8x using SQL Server 2014’s in-memory columnstore indexes for data warehousing, provide an easy justification to upgrade, especially now that the in-memory columnstore is updatable. Significantly faster query execution is achieved due to the reduction in IO, another benefit of the updatable columnstore indexes and compressed data. This customer deployed SQL Server 2014 in a production environment for several months prior to general availability of the product.
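The feature behind that ~8x compression is the clustered columnstore index, which in SQL Server 2014 became updatable and doubles as the table’s storage. Here is a minimal sketch; the table name and database are assumptions for illustration, not the customer’s schema.

    # Sketch: converting a large fact table to clustered columnstore storage.
    # Table and database names are illustrative assumptions.
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};SERVER=.;"
        "DATABASE=MarketData;Trusted_Connection=yes;", autocommit=True)

    # The clustered columnstore index replaces the rowstore entirely,
    # compressing each column separately and cutting IO for scans.
    conn.execute("CREATE CLUSTERED COLUMNSTORE INDEX CCI_Trades ON dbo.Trades")

    # sp_spaceused (a documented system procedure) shows the size change.
    row = conn.execute("EXEC sp_spaceused 'dbo.Trades'").fetchone()
    print(list(row))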

My third example is a customer that provides data services to manufacturing and retail companies; the data services enable such companies to better market and sell more product. The closer this data services company can get to providing real-time data services, the more customers its partners can reach and the better customer satisfaction its partners can provide when using the service. Before SQL Server 2014, the data services company designed its application using caching and other techniques to ensure data (e.g., a product catalog) was readily available to customers. In this scenario processing speed is important, but even more important than speed is data quality or “freshness”: if the database can provide fast access to data persisted in the database rather than to a copy in a cache, the data served is more accurate and relevant. SQL Server 2014 In-Memory OLTP technology enables the company to eliminate the application-tier cache and to scale reads and writes within the database. Data load performance improved 7x–11x. By eliminating locking and latching, In-Memory OLTP removed the lock contention the company might previously have experienced on read/write operations against the database. The performance gains were so compelling, this company went into production with SQL Server 2014 four months prior to general release.

The Technology Adoption Program (TAP) is a great way to help all of us ensure that the final product has a proven high-quality track record when released. These three customers—and as many as a hundred others—have partnered with the SQL Server engineering team to ensure that SQL Server 2014 is well tested and high quality—maybe you can sleep a little better at night knowing you are NOT the first.

We are excited by the release of SQL Server 2014; check it out here.

 

Mark Souza
General Manager
Microsoft Azure Customer Advisory Team

Microsoft adds forecasting capabilities to Power BI for O365

The PASS Business Analytics Conference, the event where big data meets business analytics, kicked off today in San Jose. Microsoft Technical Fellow Amir Netz and Microsoft Partner Director Kamal Hathi delivered the opening keynote, where they highlighted our customer momentum, showcased business analytics capabilities including a new feature update to Power BI for Office 365, and spoke more broadly on what it takes to build a data culture.

To realize the greatest value from their data, businesses need familiar tools that empower all their employees to make decisions informed by data. By delivering powerful analytics capabilities in Excel and deploying business intelligence solutions in the cloud through Office 365, we are reducing the barriers for companies to analyze, share and gain insight from data. Our customers have been responding to this approach through rapid adoption of our business analytics solutions — millions of users are utilizing our BI capabilities in Excel and thousands of companies have activated Power BI for Office 365 tenants.

One example of how our customers are using our business analytics tools is MediaCom, a global advertising agency that is using our technology to optimize performance and “spend” across its media campaigns, utilizing data from third-party vendors. With Power BI for Office 365, the company now has a unified dashboard for real-time data analysis, can share reports, and can ask natural-language questions that instantly return answers in the form of charts and graphs. MediaCom now anticipates completing analyses in days instead of weeks, with productivity gains that can add millions of dollars in value per campaign.

One of the reasons we’re experiencing strong customer adoption is our increased pace of delivery and regular service updates. Earlier this week we released updates for the Power Query add-in for Excel, and today we are announcing the availability of forecasting capabilities in Power BI for Office 365. With forecasting, users can project their data series forward in interactive charts and reports. With these new Power BI capabilities, users can explore the forecasted results, adjust for seasonality and outliers, view result ranges at different confidence levels, and hindcast to see how the model would have predicted recent results.
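Features like this are typically built on exponential smoothing models. As a conceptual illustration only (the actual service also fits seasonality, handles outliers, and computes the confidence bands mentioned above), here is Holt’s linear-trend method, a classic member of that model family, in a few lines of Python with made-up sample data:

    # Holt's linear-trend exponential smoothing: a conceptual sketch of the
    # model family behind time-series forecasting features like Power BI's.
    # The smoothing constants and the sample series are illustrative only.
    def holt_forecast(series, horizon, alpha=0.5, beta=0.3):
        level, trend = series[0], series[1] - series[0]
        for y in series[1:]:
            prev_level = level
            level = alpha * y + (1 - alpha) * (level + trend)
            trend = beta * (level - prev_level) + (1 - beta) * trend
        return [level + (h + 1) * trend for h in range(horizon)]

    monthly_sales = [112, 118, 132, 129, 121, 135, 148, 148, 136, 119]
    print(holt_forecast(monthly_sales, horizon=3))

    # "Hindcasting" is the same idea run backwards in time: fit on
    # monthly_sales[:-3] and compare the forecast against monthly_sales[-3:].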

In the keynote we also discussed how we will continue to innovate to enable better user experiences through touch-optimized capabilities for data exploration. We are also working with our customers to make their existing on-premises investments “cloud-ready”, including the ability for customers to run their SQL Server Reporting Services and SQL Server Analysis Services reports and cubes in the cloud against on-premises data. For cross-platform mobile access across all devices we will add new features to make HTML5 the default experience for Power View.

To learn more about the new forecasting capabilities in Power BI for O365, go here. If you’re attending the PASS Business Analytics Conference this week, be sure to stop by the Microsoft booth to see our impressive Power BI demos and attend some of the exciting sessions we’re presenting at the event. 

Progressive Insurance data performance grows by a factor of four, fueling business growth and a better online experience

At the Accelerate your Insights event last week, Quentin Clark described how SQL Server 2014 is now part of a platform with built-in in-memory technology across all data workloads. In particular, with this release Microsoft has added in-memory online transaction processing, delivering breakthrough performance for applications in throughput and latency.

One of the early adopters of this technology is Progressive Insurance, a company that has long made customer service a competitive strength. Central to that customer service experience is the company’s policy-serving web app. As it updated the app, Progressive planned to add its Special Lines business, which insures motorcycles, recreational vehicles, boats, and even Segway electric scooters. However, Progressive needed to know that the additional workloads wouldn’t put a damper on the customer experience.

Progressive was interested in the In-Memory OLTP capability, which can host online transaction processing (OLTP) tables and databases in a server’s working memory. The company tested In-Memory OLTP even before SQL Server 2014 became commercially available. Modifying the policy-serving app for the test was relatively straightforward, according to Craig Lanford, IT Manager at Progressive.

The company modified eight natively compiled stored procedures, using already-documented code. In those tests, In-Memory OLTP boosted the processing rate from 5,000 transactions per second to 21,000—a 320 percent increase.
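A natively compiled stored procedure is ordinary T-SQL compiled down to machine code, declared with WITH NATIVE_COMPILATION and a BEGIN ATOMIC body; that declaration syntax is SQL Server 2014’s documented form, while the procedure, table and column names below are hypothetical illustrations, not Progressive’s actual code:

    # Sketch: creating a natively compiled procedure for a session-state write.
    # Names are hypothetical; the target table must be memory-optimized.
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};SERVER=.;"
        "DATABASE=Sessions;Trusted_Connection=yes;", autocommit=True)

    conn.execute("""
    CREATE PROCEDURE dbo.usp_SaveSession
        @SessionId NVARCHAR(88), @Payload VARBINARY(7000)
    WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
    AS
    BEGIN ATOMIC WITH
        (TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = N'us_english')
        -- runs as compiled machine code against a memory-optimized table
        UPDATE dbo.SessionState SET Payload = @Payload
        WHERE SessionId = @SessionId;
    END
    """)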

Lanford and his colleagues were delighted that the session-state database proved four times as fast with SQL Server 2014. “Our IT leadership team gave us the numbers we had to meet to support the increased database workload, and we far exceeded those numbers using Microsoft In-Memory OLTP,” says Lanford. The company will use the throughput gain to support the addition of its Special Lines business to its policy-serving app and session-state database. With SQL Server 2014, Progressive can run a single, larger database reliably and avoid the cost of multiple databases.

You can read more about how Progressive is using SQL Server 2014 here.

Whether you’ve already built a data culture in your organization, or if you’re new to exploring how you can turn insights into action, try the latest enhancements to these various technologies: SQL Server 2014, Power BI for Office 365, Microsoft Azure HDInsight, and the Microsoft Analytics Platform System.

Customers using Microsoft technologies to accelerate their insights

At yesterday’s Accelerate your Insights event in San Francisco, we heard from CEO Satya Nadella, COO Kevin Turner, and CVP Quentin Clark about how building a data culture in your company is critical to success. By combining data-driven DNA with the right analytics tools, anyone can transform data into action.

Many companies, and many of our customers, are already experiencing the power of data – taking advantage of the fastest performance for their critical apps, and revealing insights from all their data, big and small.

Since SQL Server 2014 was released to manufacturing in April, we’ve seen many stories featuring the new technical innovations in the product. In-memory transaction processing (In-Memory OLTP) speeds up an already very fast engine, delivering performance improvements of typically up to 30x. Korean entertainment giant CJ E&M is using In-Memory OLTP to attract more customers for its games by holding online giveaway events for digital accessories, like character costumes and decorations, soon after each game is released. When it ran tests in an actual operational environment for one of its most popular games, SQL Server 2014 delivered 35-times-faster performance than the 2012 version in both batch requests per second and I/O throughput.

SQL Server 2014 also enhances data warehouse storage and query performance. NASDAQ OMX is using the in-memory columnstore for a system that handles billions of transactions per day, multiple petabytes of online data, and single tables with quintillions of records of business transactions. It has seen storage reduced by 50 percent and some query times reduced from days to minutes.

Lufthansa Systems is using the hybrid features of SQL Server 2014 to anticipate customer needs for high-availability and disaster-recovery solutions. Its pilot combining Microsoft SQL Server 2014 and Windows Azure has led to even faster and fuller data recovery, reduced costs, and the potential for a vastly increased focus on customer service and solutions, compared with the company’s current solutions.

Growth in data volumes presents multiple challenges and opportunities. For executives and researchers at Oslo University Hospital, ease of access to data is important. Using Power BI for Office 365, they can analyze data in hours rather than months, collaborate with colleagues around the country, and avoid traditional BI costs. For Virginia Tech, the data deluge presents challenges for researchers in the life sciences, where new types of unstructured data from gene-sequencing machines are generating petabytes of data. They are using the power of the cloud with Microsoft Azure HDInsight not only to analyze data faster but to analyze it more intelligently, which may in the future help provide cures for cancer. For The Royal Bank of Scotland, the need to handle multiple terabytes of data and an unprecedented level of query complexity more efficiently led it to the Analytics Platform System (formerly Parallel Data Warehouse). As a result, it gained near-real-time insight into customers’ business needs as well as emerging economic trends, cut a typical four-hour query to less than 15 seconds, and simplified deployment.

Whether you’ve already built a data culture in your organization, or if you’re new to exploring how you can turn insights into action, try the latest enhancements to these various technologies: SQL Server 2014, Power BI for Office 365, Microsoft Azure HDInsight, and the Microsoft Analytics Platform System.

The data platform for a new era

Earlier today, Microsoft hosted a customer event in San Francisco where I joined CEO Satya Nadella and COO Kevin Turner to share our perspective on the role of data in business. Satya outlined his vision of a platform built for an era of ambient intelligence. He also stressed the importance of a “data culture” that encourages curiosity, action and experimentation – one that is supported by technology solutions that put data within reach of everyone and every organization. 

Kevin shared how customers like Beth Israel Deaconess Medical Center, Condé Nast, Edgenet, KUKA systems, NASDAQ, telent, Virginia Tech and Xerox are putting Microsoft’s platform to work and driving real business results. He highlighted an IDC study on the tremendous opportunity for organizations to realize an additional $1.6 trillion dividend over the next four years by taking a comprehensive approach to data. According to the research, businesses that pull together multiple data sources, use new types of analytics tools and push insights to more people across their organizations at the right time, stand to dramatically increase their top-line revenues, cut costs and improve productivity. 

A platform centered on people, data and analytics
In my keynote, I talked about the platform required to achieve the data culture and realize the returns on the data dividend – a platform for data, analytics and people. 

People asking questions about data are the starting point; Power BI for Office 365 and Excel’s business intelligence features help get them there. Data is key: data from all kinds of sources, including SQL Server and Azure, along with access to the world’s data from Excel. Analytics brings order and surfaces insights from broad data: analytics from SQL Server and Power BI for Office 365, and Azure HDInsight for running Hadoop in the cloud.

A platform that solves for people, data, and analytics accelerates with in-memory. We created the platform because customers increasingly need technology that scales with big data and accelerates their insights at the speed of modern business.

Having in-memory across the whole data platform creates speed that is revolutionary on its own, and with SQL Server we built it into the product that customers already know and have widely deployed. At the event we celebrated the launch of SQL Server 2014. With this version we now have in-memory capabilities across all data workloads, delivering breakthrough performance for applications in throughput and latency. Our relational database in SQL Server has been handling data warehouse workloads at terabyte-to-petabyte scale using in-memory columnar data management. With the release of SQL Server 2014, we have added in-memory online transaction processing. In-memory technology has been allowing users to manipulate millions of records at the speed of thought and scaling analytics solutions to billions of records in SQL Server Analysis Services.

The platform for people, data and analytics needs to be where the data and the people are. Our on-premises and cloud solutions provide endpoints for a continuum of how businesses actually manage data and experiences, making hybrid a part of every customer’s capability. Today we announced that our Analytics Platform System is generally available. This is the evolution of the Parallel Data Warehouse product, and it now supports the ability to query across the traditional relational data warehouse and data stored in a Hadoop region, either in the appliance or in a separate Hadoop cluster. SQL Server has seamless integration with VMs in Azure to provide secondaries for high availability and disaster recovery. The data people access in the business intelligence experience comes through Excel from their own data and partner data, and Power BI provides accessibility to wherever the data resides.

The platform for people, data and analytics needs to have full reach. The natural-language Q&A feature in Power BI for Office 365 is significant in that it provides data insights to anyone who is curious enough to ask a question. We have changed who is able to reach insights by not demanding that everyone learn the vernacular of schemas and chart types. With SQL Server, the most widely deployed database on the planet, many people already have the skills to take advantage of all the capabilities of the platform. And with a billion people who know how to use Excel, the skills to engage with data are already widespread.

Looking forward, we will be very busy. Satya mentioned some work we are doing in the Machine Learning space, and today we also announced a preview of Intelligent Systems Service, just a couple of the things we are working on to deliver a platform for the era of ambient intelligence. The Machine Learning work originates in what it takes to run services at Microsoft like Bing. We had to transform ML from a deep vertical domain into an engineering capability, and in doing so we learned what it would take to democratize ML for our customers. Stay tuned.

The Internet of Things (IoT) space is very clearly one of the most important trends in data today. Not only do we envision the data from IoT solutions being well served by the data platform, but we need to ensure the end-to-end solution can be realized by any customer. To that end, Intelligent Systems Service (ISS) is an Internet of Things offering built on Azure, which makes it easier to securely connect, manage, capture and transform machine-generated data regardless of the operating system platform.

It takes a data platform built for the era of ambient intelligence, with data, analytics and people, to let companies get the most value from their data and realize a data culture. I believe Microsoft is uniquely positioned to provide this platform through the speed of in-memory, our cloud, and our reach. Built on the world’s most widely deployed database, connected to the cloud through Azure, delivering insights to billions through Office, and understanding the world through our new IoT service, it is truly a data platform for a new era. When you put it all together, only Microsoft brings so comprehensive a platform and so much value to our customers.

 

Quentin Clark
Corporate Vice President
Data Platform Group

The SQL Server Community Looks to Emerging Trends in BI and Database Technologies

At PASS Summit this year, Ted Kummert outlined his views on accelerating insights in the new world of data. He mentioned in his blog post that this is an incredible time for the industry and that data has emerged as the new currency of business.

Given that it’s such an exciting time to be in the industry, we thought this would be an ideal opportunity to ask some of the SQL Server community members attending PASS about what issues from the past they are glad are behind them, and about what industry and technology trends they are looking forward to in the future.

The answers from community members on what future trends they are most interested in were extremely diverse, covering topics such as big data, new data visualizations, in-memory technologies, and cloud-based and hybrid architectures. Watch the full video below to hear what the SQL Server community had to say.

Incidentally, many of the people featured in the video have already worked on published SQL Server 2012 customer stories.  You can find a complete list of these case studies at www.microsoft.com/sqlcustomers.

Crutchfield Turns to Microsoft and EMC to Help Transform SQL Server to the Private Cloud

When it comes to consumer electronics gear, audio and video enthusiasts rely on Crutchfield Corporation for excellent customer service and stellar product know-how. Crutchfield powers its information-based service with a wide range of tools for its website visitors, customers and internal customer advisors. To keep improving its stellar customer service, Crutchfield develops most of its line-of-business applications in-house, many of them leveraging SQL Server. In recent years, an expanding set of applications led to rampant data and server growth in its data center.

To address this challenge, Crutchfield looked to EMC and Microsoft technologies. Already a user of Microsoft technologies, Crutchfield was able to virtualize 75% of its EMC storage infrastructure using Microsoft Windows Server Hyper-V.  

Using EMC and Microsoft technologies, Crutchfield was able to:

  • Save a total of $500,000 through virtualization using Windows Server Hyper-V
  • Drive applications to market 20 percent faster
  • Decrease SQL Server disk read times and latency from 5–10 milliseconds to less than 1 millisecond
  • Improve storage utilization from 40 percent to 80 percent

Watch Craig VanHuss, Crutchfield’s Information Systems Manager of Enterprise Storage, discuss in this video how the company worked with EMC and Microsoft to transform its Microsoft applications, speeding performance and increasing efficiency.