Monthly Archives: April 2013

Little Data Remains Important in Healthcare IT

In his article Healthcare’s Big Problem With Little Data, author Dan Munro raises salient points about the state of health-related data. Electronic Health Records (EHR) were promoted as the end-all-be-all solution for the industry – a standardization that, I suppose, many thought would organically and naturally occur, stabilize, and be maintained. It hasn’t. At least not yet. My doctor and I speak about this almost every time I visit with him. The corporation that operates his practice nowadays seems…(read more)

The “Consumerization” of IT and the Dark Side of the Cloud

Cloud computing is largely being driven by the “Consumerization of IT”. That phrase, as grammatically incorrect as it is, represents a fundamental change in the way businesses think about technology, and consequently in how the IT team provides it.

Years ago, technology was introduced through the office. No one owned a mainframe at home, of course, and even in the early years of the PC few people could afford to have one in their houses. Other than game consoles and hobbyists on small computers, most full-up “PCs” were used for work.

That rapidly changed with the lowering of costs and the miniaturization of technology. PCs and then laptops became ubiquitous in the home, and of course the “smart phone” ushered in an entire generation where the technology available to the consumer outpaced what was installed at the place of work. Many of us have laptops that are more powerful than some of the servers our companies use for certain applications.

IT as a department grew up in the era of “office-first” technology. Modern users, especially those controlling the budget, are now more “home-first” technology buyers. In extreme cases, I’ve seen IT departments relegated to maintaining legacy systems, with new IT projects being scoped, designed, and run by business teams – usually on a Cloud Computing platform. The business wants to create a technical solution as quickly as they can download an app to their phones; they want the same speed and ease in their business work that they have with their home technology.

However, this can be problematic if not thought through. As with any new technology, Cloud Computing provides both benefits and concerns. It’s true that almost anyone can stand up a server or deploy an application quickly with nothing more than an e-mail address and a credit card. But business teams are not always aware of concerns such as security that IT teams have solved through many hours of careful planning. Unfortunately, it’s often a matter of “Ready, Fire, Aim.”

So what is the business (who wants the agility of a smart phone and a single-click solution) to do? What about the need for security, strategic design, integration and all of the other functions that IT needs to handle? This is where I think Windows Azure (not to be too sales-y) handles the situation well.

If you’re using another cloud provider, by the way, that’s fine. The concepts here are the same.

Microsoft sells an on-premises operating system, and has done so for many years. We’ve architected Windows Azure Virtual Machines, Active Directory Services, Platform-as-a-Service, and even the Hadoop and other offerings to work together – and with the tools that you use to manage your systems today, like System Center and PowerShell.
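As a minimal sketch of that “manage it with the tools you already know” idea – assuming the Windows Azure PowerShell module is installed and you have downloaded a .publishsettings file for your subscription (the path and subscription name below are placeholders) – a few lines of PowerShell can connect to a subscription and inventory its virtual machines:

  Import-AzurePublishSettingsFile "C:\azure\MySubscription.publishsettings"   # hypothetical path to your credentials file
  Select-AzureSubscription -SubscriptionName "MySubscription"                 # hypothetical subscription name
  Get-AzureVM                                                                 # list the virtual machines in the current subscription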

To the business team, I say this:

  • Work with your IT staff on projects, even if you’re designing the project and paying for it – the IT professionals can keep you out of danger. Most of them have made the mistakes you’re going to make, and know what to do to avoid them.
  • Plan for the future – a “This is just a proof-of-concept” project becomes production in a frighteningly short period of time.
  • Understand the cost model – a good architect can solve one problem in multiple ways, and cost is always a vector. The IT team can help you with this – they have the relationships with the vendors to consolidate and help you understand those costs.

To the IT team, I have this advice:

  • Don’t stand in the way of the business – they’ll just go around you. Work with them. Enable the business to do what they need, when they need it, and they’ll work with you. I saw both outcomes during the mainframe-to-PC transition, and I’m seeing them again in the PC-to-cloud transition. Change is inevitable – get on board or become irrelevant to the people who pay your salary.
  • Learn the cloud. Talk to your vendor, get training, read up, ask questions. If this bothers the vendor, get a different one.
  • Create a self-service portal. This point may be the most important one. Become your own “Cloud”, and your users won’t need to go elsewhere. I’ll talk more about how to do this in another post.

 

In the end, the relationship between the IT team and the business is eerily similar to a marriage – it’s an amazing thing, it takes a lot of work to get right, and the “Consumerization of IT” is that cute person at the end of the bar. Work together, or one of you will soon be with somebody new.

PowerPivot Workbook Size Optimizer #powerpivot #tabular

Microsoft released the Workbook Size Optimizer for Excel, the first version of an add-in for Excel 2013 that inspects the data model and suggests possible optimizations. Fundamentally, it tries to apply the best practices described in a white paper I mentioned a few weeks ago, removing useless columns and changing the granularity of those columns where doing so could reduce the overall memory cost of a table.

There are different setups available on the download page, depending on the operating system (Windows 7 or Windows 8) and the Office version (32-bit or 64-bit). Once installed, you have a new tab in the Excel ribbon, called Workbook Size Optimizer, showing a single button that starts a wizard.

I tried running the optimizer on a workbook into which I had imported several tables from the Adventure Works Data Warehouse sample database. The first page shows some information about the workbook size and offers a choice between automatic detection and manual selection of rules. The latter is an option you can also request later, so I started with the default.

After a short analysis, I received three suggestions (considering the model I have). We might question whether removing UnitCost is really a smart idea, because it could be required in order to perform calculations, and rounding the value might not be correct for our analysis.

Since I requested that some changes be applied, I had the option of changing which rules to apply. This corresponds to the choice you have if you select “Let me choose the rules myself” on the first screen of the wizard.

I kept all the rules, and after clicking Next I had to wait several seconds for the optimization process to complete. The final page shows some information about the results of the job.

This is a good starting point. Don’t blindly trust every suggestion; consider carefully which rules to apply so that you avoid losing data that is important for your analysis. Moreover, you probably know your data model better than a wizard does, and you may find many columns that are useless for your analysis but are not identified by the wizard and can be deleted. My article Checklist for Memory Optimizations in PowerPivot and Tabular Models contains several best practices that you can apply to your data model.

Using PowerShell to access event logs for SQL Server

This tip will introduce a few PowerShell cmdlets related to accessing and handling Windows event logs. The event logs capture various system events that occur for both Windows and specific applications like SQL Server. If anything goes wrong with your SQL Server box, the event logs are one of the first places to look to help troubleshoot the issue. In this tip I will explain how to access the event logs using Windows PowerShell cmdlets.
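As a quick taste – a minimal sketch, assuming the SQL Server services on the box write to the Application log under sources beginning with “MSSQL” – the built-in Get-EventLog cmdlet can pull the most recent SQL Server-related errors:

  Get-EventLog -LogName Application -EntryType Error -Newest 50 |
      Where-Object { $_.Source -like "MSSQL*" } |          # keep only SQL Server-related sources
      Select-Object TimeGenerated, Source, EventID, Message |
      Format-List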

Problems using BETWEEN

The BETWEEN operator is a handy SQL construct, but it can cause unexpected results when it isn’t understood. Consider the following code snippet: where x between .9 and 1.10 One of the questions you should ask is this: What is x? What if x has a float, real, or double data type? These data types do not store exact representations of numbers, only approximations. When 0.9 is stored in a real column or variable, it may be between 0.9 and 1.1. Or it may not. When you set a real to 0.9, internally it…(read more)
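To see the underlying rounding issue outside of SQL Server – a small PowerShell sketch, offered only as an illustration of single-precision storage (the same storage SQL Server uses for the real type):

  $stored = [float]0.9      # single precision, like SQL Server's real type
  [double]$stored           # about 0.8999999761581421 - the value actually stored
  0.9 -le $stored           # False: a double-precision 0.9 is larger than the stored approximation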

Relativity e-discovery on SQL Server

Back in late 2011 to early 2012, I was asked to look into issues for a SQL Server system supporting kCura Relativity. Relativity is an e-discovery platform, that is, a document search management system frequently used for document discovery in legal cases. So it also has auditing to prove that a search was done. Normally I would post items both here and on my own website. Now it is my nature to say things that others find to be not entirely polite (see my previous post) and I am too old to change….(read more)

Viewing and Interacting with SSRS Reports on an iPad

This video demonstrates how to view a Reporting Services report on an Apple iPad that has Apple iOS 6 and Apple Safari. You’ll learn how to access reports from the report server and from email, use touch to collapse and expand row groups, sort columns, and filter data using parameters, and export the report to different formats.

For more information about viewing reports on mobile devices, see View Reporting Services Reports on Microsoft Surface Devices and Apple iOS Devices.

For an overview of browser support for Reporting Services and Power View, see Planning for Reporting Services and Power View Browser Support.

Issues replicating XML data types for databases in SQL Server 2000 compatibility mode

We have a legacy application that was not certified for SQL 2005, so we were running it on a SQL 2005 server using SQL 2000 compatibility mode. The in-house development team is adding functionality and has created a table with XML data types. The problem is that this table also needs to be replicated, and now replication fails. In this tip we look at how to resolve this issue.
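As a starting point for investigating this on your own servers – a hypothetical sketch (not the tip’s actual resolution), assuming the sqlps module with Invoke-Sqlcmd is available and “MYSERVER” stands in for your instance name – you can list the databases still running at the SQL 2000 (80) compatibility level:

  # hypothetical server name; sys.databases reports each database's compatibility level
  Invoke-Sqlcmd -ServerInstance "MYSERVER" -Query "SELECT name, compatibility_level FROM sys.databases WHERE compatibility_level = 80;"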