A current trend that I can't say I love is the constant churn within software applications. I have no interest in going back to the days when we got a new version of SQL Server or Power BI, etc. every few years.
In fact, I thrive on change. However, I've now become really concerned about how anyone:
* Learns to use a complex application
* Remembers how to use a complex application when they don't use it daily
I first really struck this issue with Azure. When I was teaching a class that used Azure, I could check every single lab on Sunday night, and by Monday morning the students would find it had all changed. That's OK for an experienced person, but not OK for a learner.
I love the rate of change for Power BI. We're endlessly getting wonderful new things. But I have to say that every class that I teach on this is like a new experience. I've got another one this coming Tuesday. I used to look forward to them but now I have a major hesitation every time, as I wonder what parts of the labs will have broken.
This is now an ongoing challenge for all software of this type. I helped create some labs for VSTS quite recently, and when I look at the product now, it barely resembles the one I built the labs on.
Is it better? Probably yes.
But even though that might have been a few months ago, it feels like just the other week, and yet not only has the UI changed, but entire concepts have been added or removed, and the order in which things are done has changed substantially.
I don't know the answer to this but the current rate of churn is a substantial issue.
I gather the plan with the DevOps guys is to put a set of labs on GitHub, and let the people doing the labs point out issues day by day as they strike them. Again, that might work for experienced users, but I really wonder how it will come across to newcomers.
Will they realize the app must have changed and it's all different, or will they just think the product is too hard to use? Either way, they'll be very frustrated.
And while initially learning the product is one thing, I'm worried about it longer-term. A product like VSTS lets you set up automation that you hope you won't need to change constantly. But if, every time you go to make a change, you're struggling to use it like you're a newbie again, that's a problem.
Finally, I'm really concerned about ongoing support.
The vast majority of support for software applications today comes from community resources like blogs, webcasts, etc.
Will they continue to be created at the same pace if the authors know they'll be irrelevant or wrong within a very short time? How will end-users learn to do things when none of the online examples they find still work?
If you have ever tried to write scripts to configure operating systems, you'll know just how hard that is to get right.
For example, if you need to ensure that your virtual machine has two network adapters, and they need to have a specific set of IP addresses, how do you do that?
The traditional approach for PowerShell (and similar tools) was to try to write a step-by-step script to configure the network adapters the way you want. But where do you start? Do you write a script to check for any existing adapters and loop through them to try to remove them? Do you try to add the ones you want, and then remove the others?
You'll quickly realize that you get into very, very complex coding, because you can't be sure what your starting point is. Everything has to stay on the path you prescribed.
And worse, what happens if you run this script more than once?
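As a rough sketch of where this leads (the adapter names and IP addresses here are hypothetical), the imperative script ends up looking something like this, and every branch is another chance to get it wrong:

```powershell
# Imperative approach: spell out every step, and try to handle every
# possible starting state. Adapter names and addresses are hypothetical.
$wanted = @{
    'Ethernet0' = '192.168.1.10'
    'Ethernet1' = '10.0.0.10'
}

foreach ($adapterName in $wanted.Keys) {
    $adapter = Get-NetAdapter -Name $adapterName -ErrorAction SilentlyContinue
    if ($null -eq $adapter) {
        throw "Adapter $adapterName is missing - now what?"
    }

    # Remove any IPv4 addresses we didn't ask for...
    Get-NetIPAddress -InterfaceAlias $adapterName -AddressFamily IPv4 |
        Where-Object { $_.IPAddress -ne $wanted[$adapterName] } |
        Remove-NetIPAddress -Confirm:$false

    # ...then add the one we want, unless it's already there (or is it?)
    $existing = Get-NetIPAddress -InterfaceAlias $adapterName `
        -IPAddress $wanted[$adapterName] -ErrorAction SilentlyContinue
    if (-not $existing) {
        New-NetIPAddress -InterfaceAlias $adapterName `
            -IPAddress $wanted[$adapterName] -PrefixLength 24
    }
}
```

Even this sketch quietly assumes the adapters already exist, that nothing else is managing them, and that running it twice won't trip over its own earlier changes.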
That's the problem with imperative code. You are telling the operating system the steps required for configuration.
We don't want to be doing this.
With a declarative approach, we tell the system how we want it to end up, i.e., the desired state, and let it worry about how to get configured that way. This is what we want to be doing instead of writing imperative code.
PowerShell offers DSC (Desired State Configuration), where you write a declarative configuration (compiled to a MOF document) that describes the way you want the system to end up. A bonus of this approach is that it's idempotent, i.e., no matter how many times you run it, you end up with the same outcome.
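As a sketch of the two-adapter scenario above (this assumes the community NetworkingDsc module, and the adapter names and addresses are hypothetical), a DSC configuration just states the end result and lets DSC work out the steps:

```powershell
# Declarative approach: describe only the desired state.
# Assumes the NetworkingDsc module; names and addresses are hypothetical.
Configuration TwoAdapterNetwork {
    Import-DscResource -ModuleName NetworkingDsc

    Node 'localhost' {
        IPAddress FirstAdapter {
            InterfaceAlias = 'Ethernet0'
            AddressFamily  = 'IPv4'
            IPAddress      = '192.168.1.10/24'
        }
        IPAddress SecondAdapter {
            InterfaceAlias = 'Ethernet1'
            AddressFamily  = 'IPv4'
            IPAddress      = '10.0.0.10/24'
        }
    }
}

TwoAdapterNetwork -OutputPath .\TwoAdapterNetwork   # compiles to a MOF document
Start-DscConfiguration -Path .\TwoAdapterNetwork -Wait -Verbose
```

Run it once or run it ten times: the machine converges to the same state, and there's no hand-written logic for cleaning up whatever was there before.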
It's important, wherever possible, to be writing declarative configuration, not imperative configuration. In later posts, I'll talk more about how.
I’ve worked with T-SQL for a very long time, and over the years I’ve lost count of how many tools I’ve found the need to create to help me in my work.
They have lived in a variety of script folders, and whenever I go to use one now, I often have to decide which is the best version of a particular tool, as they’ve also been refined over time.
So I decided to get them into a clean, clear shape, and SDU Tools was born. I’ve grabbed a whole bunch of tools for a start, made sure they’re reasonably consistent in the way they do things, and published them as a single script. I figured I might as well share them with others too. They’re free (I get you to register so I know what usage they’re getting).
For each of the tools, I’ve also created a YouTube video that shows how to use the tool. I’m also planning to create blog posts for each tool so I have the opportunity to show things that won’t fit in a short video and ways of using them that might be less obvious.
I’ve got a big backlog of tools that I’d like to add so I’m intending to add in whichever ones I get reworked during each month. Likely areas in the upcoming months are further functions and procedures related to scripting objects, and also to code generation.
The tools ship as a single T-SQL script, don’t require SQLCLR or anything too clever, and are targeted at SQL Server 2008 and above. They’re shipped in a single schema that you can place in whichever database (or databases) suits you best.
Of course, there’s the usual rider that you use them at your own risk. If they don’t do what they should, I’ll be sad and then fix it, but that’s all.
I hope you find them useful.
You can find out more about our free SDU Tools here:
There’s an interesting scenario that’s come up in creating some new samples for SQL Server 2016.
I know that for many, many years people have asked for a way to limit developer edition to only use standard edition features, or at least to have a way to flag them. I’ve often thought that I’d like a way to set a “target environment” and have the colorization change anything that I can’t use in standard to an obviously different color.
However, previously, if you used developer edition to create a database, as long as you didn’t use any enterprise features, you could then back up and restore that database to a standard edition server.
That’s no longer the case with SQL Server 2016.
If you create a temporal table in a database on developer edition, it automatically chooses PAGE compression for the history table. There is also no DDL to avoid this. The database can then no longer be restored on a standard edition server. To work around it, you need to rebuild the history table without compression after it’s created.
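As a sketch of that workaround (the table name here is hypothetical), you can find the history tables that picked up compression and rebuild them without it before taking the backup:

```sql
-- Find history tables that were created with compression.
SELECT t.name AS history_table, p.data_compression_desc
FROM sys.tables AS t
INNER JOIN sys.partitions AS p
    ON p.object_id = t.object_id
WHERE t.temporal_type = 1 -- history table
  AND p.data_compression_desc <> 'NONE';

-- Rebuild the history table (hypothetical name) without compression
-- so the database can be restored on a standard edition server.
ALTER TABLE History.CustomerHistory
REBUILD WITH (DATA_COMPRESSION = NONE);
```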
I see this as quite a change in behavior. I don’t think that features that are only available in enterprise (or perhaps Azure) editions should be “auto-injected”.
Clearly, if you script out the database and create a new database on standard edition using the script, it will create the history table without compression. But I suspect that many people create DBs as starter DBs using developer edition, and then deploy them to other editions that don’t support compression.
I’d be interested in thoughts on how common this practice currently is.
In recent months, I’ve been brushing up my R skills. I’ve had a few areas of interest in this:
* R in Azure Machine Learning
* R in relation to Power BI and general analytics
* R embedded (somewhat) in SQL Server 2016
As a client tool, I’ve been using RStudio. It’s been good and very simple but it’s a completely separate environment. So I was excited when I saw there was to be a preview of new R tooling for Visual Studio.
I’ve been using a pre-release version of R Tools for Visual Studio for a short while, but I’ve already come to quite like it. It’s great to have this embedded directly within Visual Studio. I can do everything I used to do in RStudio, but I really like the level of IntelliSense, etc. that I pick up when I’m working in R Tools for Visual Studio.