SQL 2016 and R – Working Directory

It’s been a busy short week, with the Thanksgiving holiday, putting up Christmas decorations, and the time consuming task I like to call my job. But I did start to play with the rlauncher config file. Not much, just in setting the workspace R will be using.

This is the line that defines the default workspace:

WORKING_DIRECTORY=C:\PROGRA~1\MICROS~2\MSSQL1~1.MSS\MSSQL\EXTENS~1 

Books Online has a note that warns not to change the value. But hey, this is just a CTP, right? Why not test that out? So I changed the value to a new folder, C:\SQL2016\RWorkspace. Afterwards, checking the working directory didn’t give me the result I was expecting.
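For reference, I was checking the working directory with a quick script along these lines:

```sql
-- Ask R where it thinks it's working from
EXECUTE sp_execute_external_script
   @language = N'R',
   @script = N'OutputDataSet <- data.frame(wd = getwd())',
   @input_data_1 = N''
WITH RESULT SETS ((WorkingDirectory varchar(260)));
```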


Not only that, I got a different result each time I ran the script. Each time a new sub-directory was created.


I reset the value of the working directory back to the default. And I’ll leave it as is. If I want to change the working directory, maybe to store functions I can source out later, I’ll change it in my script.
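Changing it per-script is just a call to setwd() at the top of the R code. A minimal sketch, with a hypothetical path and helper file:

```sql
EXECUTE sp_execute_external_script
   @language = N'R',
   @script = N'setwd("C:/SQL2016/RWorkspace")   # hypothetical workspace path
source("helpers.R")                             # hypothetical function file to source
OutputDataSet <- data.frame(wd = getwd())',
   @input_data_1 = N''
WITH RESULT SETS ((WorkingDirectory varchar(260)));
```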


I’m not done with the rlauncher config yet; I’ll have more later.

Reference

Configure and Manage Advanced Analytics Extensions
https://msdn.microsoft.com/en-US/library/mt590869.aspx

SQL 2016 and R – Working With SQL Data

In the last few posts I’ve gone over installing the components you need to run R in a SQL 2016 instance as well as some simple scripts that played with one of the data sets that’s included with R. Today we’ll start working with data stored inside a SQL database.

Let’s start with a simple query against the AdventureWorks2016 database. We’re going to select the CustomerID and OrderDate columns from Sales.SalesOrderHeader and the UnitPrice column from Sales.SalesOrderDetail for the year 2014. Our R script just returns the same data we supplied. The only thing I did differently this time is to put the R and SQL scripts into separate variables. It looks cleaner to me this way, but you may disagree. In any case the results are the same.

DECLARE @RScript nvarchar(max)
SET @RScript = N'OutputDataSet <- InputDataSet'

DECLARE @SQLScript nvarchar(max)
SET @SQLScript = N'SELECT soh.CustomerID, soh.OrderDate, sod.UnitPrice
FROM Sales.SalesOrderHeader AS soh
INNER JOIN Sales.SalesOrderDetail AS sod ON soh.SalesOrderID = sod.SalesOrderID
WHERE YEAR(soh.OrderDate) = 2014'

EXECUTE sp_execute_external_script
@language = N'R',
@script = @RScript,
@input_data_1 = @SQLScript
WITH RESULT SETS ((CustomerID varchar(20), OrderDate datetime, UnitPrice money))


However, you’re probably not going to use R just to echo the incoming data; you’ll want to shape it somehow. We’ll supply the same dataset to R, but this time we’ll convert the integer CustomerID to an R factor. We’ll create two new variables inside R: col1 will hold the factor levels, i.e. the distinct CustomerIDs, and the tapply function will assign the average UnitPrice for each CustomerID to col2. Then we’ll return both columns back to SQL as a data frame.

DECLARE @RScript nvarchar(max)
SET @RScript = N'df <- InputDataSet
df$CustomerID <- as.factor(df$CustomerID)
col1 <- levels(df$CustomerID)
col2 <- tapply(df$UnitPrice, df$CustomerID, mean)
OutputDataSet <- data.frame(col1, col2)'

DECLARE @SQLScript nvarchar(max)
SET @SQLScript = N'SELECT soh.CustomerID, soh.OrderDate, sod.UnitPrice
FROM Sales.SalesOrderHeader AS soh
INNER JOIN Sales.SalesOrderDetail AS sod ON soh.SalesOrderID = sod.SalesOrderID
WHERE YEAR(soh.OrderDate) = 2014'

EXECUTE sp_execute_external_script
@language = N'R',
@script = @RScript,
@input_data_1 = @SQLScript
WITH RESULT SETS ((CustomerID varchar(20), AvgUnitPrice money))


Again, this was pretty simple. We could have done the same thing with a GROUP BY clause in our SQL script; I just want to show that you can apply R functions to the SQL dataset you pass in.
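For comparison, the pure T-SQL version of that aggregation would be something like:

```sql
SELECT soh.CustomerID, AVG(sod.UnitPrice) AS AvgUnitPrice
FROM Sales.SalesOrderHeader AS soh
INNER JOIN Sales.SalesOrderDetail AS sod ON soh.SalesOrderID = sod.SalesOrderID
WHERE YEAR(soh.OrderDate) = 2014
GROUP BY soh.CustomerID;
```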

Let’s do another, similar example. This time we won’t filter OrderDate by year; instead we’ll work with all orders. We’ll add a column for the month each order took place in, then find the average unit price for each month.

DECLARE @RScript nvarchar(max)
SET @RScript = N'df <- InputDataSet
df$month <- factor(format(df$OrderDate, "%B"), levels = month.name)
col1 <- levels(df$month)
col2 <- tapply(df$UnitPrice, df$month, mean)
OutputDataSet <- data.frame(col1, col2)'

DECLARE @SQLScript nvarchar(max)
SET @SQLScript = N'SELECT soh.CustomerID, soh.OrderDate, sod.UnitPrice
FROM Sales.SalesOrderHeader AS soh
INNER JOIN Sales.SalesOrderDetail AS sod ON soh.SalesOrderID = sod.SalesOrderID'

EXECUTE sp_execute_external_script
@language = N'R',
@script = @RScript,
@input_data_1 = @SQLScript
WITH RESULT SETS (([Month] varchar(20), AvgUnitPrice money))


Let’s do one last example. This time we’ll return a simple bar plot showing the monthly average of unit prices. We’ll load the ggplot2 library for this one, and we’ll return the plot as a varbinary(max). We’ll also wrap it all inside a stored procedure.

CREATE PROCEDURE PlotMonthlyAverage
AS
DECLARE @RScript nvarchar(max)
SET @RScript = N'
library(ggplot2)
image_file <- tempfile()
jpeg(filename = image_file, width = 500, height = 500)
df <- InputDataSet
df$month <- factor(format(df$OrderDate, "%B"), levels = month.name)
col1 <- levels(df$month)
col2 <- tapply(df$UnitPrice, df$month, mean)
monthAvg <- data.frame(col1, col2)
names(monthAvg) <- c("Month", "AvgUnitPrice")
print(ggplot(monthAvg, aes(x=Month, y=AvgUnitPrice)) + geom_bar(stat="identity", fill="lightblue", colour="black"))
dev.off()
OutputDataSet <- data.frame(data=readBin(file(image_file, "rb"), what=raw(), n=1e6))'

DECLARE @SQLScript nvarchar(max)
SET @SQLScript = N'SELECT soh.CustomerID, soh.OrderDate, sod.UnitPrice
FROM Sales.SalesOrderHeader AS soh
INNER JOIN Sales.SalesOrderDetail AS sod ON soh.SalesOrderID = sod.SalesOrderID'

EXECUTE sp_execute_external_script
@language = N'R',
@script = @RScript,
@input_data_1 = @SQLScript
WITH RESULT SETS ((BarPlot varbinary(max)))
GO

When you execute the procedure in SSMS you won’t see the bar plot, but you can if you call the procedure from a report. Or you can have even more fun and save the plot inside a FileTable!

DECLARE @Plot TABLE (col1 varbinary(max))

INSERT INTO @Plot (col1)
EXECUTE dbo.PlotMonthlyAverage

INSERT INTO DemoDB.dbo.documents(name, file_stream)
SELECT 'Plot.jpg', col1
FROM @Plot


Full disclosure: I am not an “R”tist, nor am I a statistician. I merely dabble, as you can probably tell by my examples. I definitely need to work on the graph, especially the ordering on the x axis. But I need to save some material for future posts. :)

SQL 2016 and R – Some Basic Examples

OK, I’m itching to finally start working with R inside of SQL 2016. I’m running my examples on a SQL 2016 CTP3 instance. If you’re running CTP 2.4 or earlier these examples won’t work.

We’ll start out with a very basic example. Here the string “Hello World” is passed into the R script as InputDataSet and assigned to OutputDataSet, which is what gets returned to SQL.

EXECUTE sp_execute_external_script
@language = N'R',
@script = N'OutputDataSet <- InputDataSet',
@input_data_1 = N'SELECT ''Hello World'' AS col'
WITH RESULT SETS((col char (11)))


Another way to run the same script is to use a different variable inside the R script and return the value with the @output_data_1_name parameter:

EXECUTE sp_execute_external_script
@language = N'R',
@script = N'df <- InputDataSet',
@input_data_1 = N'SELECT ''Hello World 2'' AS col',
@output_data_1_name = N'df'
WITH RESULT SETS((col char (13)))


Of course we want to work with data, not some simple strings. Let’s take a look at the iris data set that comes with R. We’ll limit the results to just the first 6 rows for now. We still need an input, but we’ll just pass in a blank string.

EXECUTE sp_execute_external_script
   @language = N'R',
   @script = N'OutputDataSet <- head(iris)',
   @input_data_1 = N''
WITH RESULT SETS (([Sepal.Length] float, [Sepal.Width] float,
                   [Petal.Length] float, [Petal.Width] float,
                   [Species] varchar(25)));


This works because we already knew what columns to return and their data types. But what if we didn’t? In that case we can return a string and use RESULT SETS UNDEFINED to view the makeup of the data set.

EXECUTE sp_execute_external_script
   @language = N'R',
   @script = N'OutputDataSet <- data.frame(str(iris))',
   @input_data_1 = N''
WITH RESULT SETS UNDEFINED;


So now we can see the columns and data types we need to return.

Here’s a very basic example of statistics and R. We’ll just get the mean value for each column.

EXECUTE sp_execute_external_script
   @language = N'R',
   @script = N'temp <- colMeans(iris[, 1:4])
               tempIris <- rbind(temp)
               OutputDataSet <- as.data.frame(tempIris)',
   @input_data_1 = N''
WITH RESULT SETS(([Sepal.Length] float, [Sepal.Width] float,
                  [Petal.Length] float, [Petal.Width] float));


And for the last example today, we’ll save the values back to a table in my DemoDB database. And instead of running the R script ad-hoc we’ll save it as a stored procedure.

USE DemoDB;
GO

CREATE TABLE dbo.irisMeans (
   SepalLength    float,
   SepalWidth     float,
   PetalLength    float,
   PetalWidth     float);
GO

CREATE PROCEDURE GetIrisMeans AS
EXECUTE sp_execute_external_script
   @language = N'R',
   @script = N'temp <- colMeans(iris[, 1:4])
               tempIris <- rbind(temp)
               OutputDataSet <- as.data.frame(tempIris)',
   @input_data_1 = N''
WITH RESULT SETS(([Sepal.Length] float, [Sepal.Width] float,
                  [Petal.Length] float, [Petal.Width] float));

INSERT INTO dbo.irisMeans(SepalLength, SepalWidth, PetalLength, PetalWidth)
EXECUTE GetIrisMeans;

SELECT * FROM dbo.irisMeans;


I’ll end my post here. In my next we’ll start to work with data from a database.

SQL 2016 and R – Some Quick Notes

I’ve got SQL 2016 CTP3 installed and I’ve been playing with R inside SQL. But before I start posting some examples I want to share my notes first.

  • Permissions:
    • I mentioned last week that you should add accounts to the db_rrerole to be able to run R scripts. However, BOL says this is a temporary role for CTP3; you should grant the EXECUTE ANY EXTERNAL SCRIPT database permission instead.
  • Packages:
    • The default library path for R packages is C:/Program Files/RRO/RRO-3.2.2-for-RRE-7.5.0/R-3.2.2/library. This folder is read-only after installation; you’ll need to set the folder’s write permissions if you want to install packages here.
    • If you don’t make the default library writable you can still install packages to a local directory. However, I wasn’t able to call those packages from within a SQL procedure. I tried variations of this:
      library("ggplot2", lib="C:\Users\John\MyRPackages")
      But my syntax could be wrong; I’ll keep playing.
    • Books Online says you can install a package to the default directory by running a command prompt as an administrator, navigating to the default package location, then running install.packages. I haven’t been successful doing that yet, either.
    • There are no dynamic management objects to view installed packages yet (coming in a future release?), but you can still see what packages are available:
      EXECUTE sp_execute_external_script
      @language = N'R'
      , @script = N'str(OutputDataSet);
      packagematrix <- installed.packages();
      NameOnly <- packagematrix[,1];
      OutputDataSet <- as.data.frame(NameOnly);'
      , @input_data_1 = N'SELECT 1 as col'
      WITH RESULT SETS ((PackageName nvarchar(250)))
  • sp_execute_external_script:
    • This is how you will execute R scripts inside of SQL.
    • It can be called ad-hoc or as part of a stored procedure
    • For now at least it will just return a single dataset. This will change in the future but maybe not by the full release of SQL 2016.
    • The first parameter is @language. BOL says it can be any registered language though right now that’s just R. Does that mean Python or other languages will be coming in the future?
    • At its very basic you need to supply @language and @script. OutputDataSet is the default for @output_data_1_name, InputDataSet is the default for @input_data_1_name.
    • You don’t need WITH RESULT SETS if you’re not returning a data set. The result set definition must use data types that map to R: bit, int, float, datetime, and varchar. Use CAST for other data types.
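As a sketch of that last point, here’s what the cast looks like on the way in; the money column is converted to float so it maps cleanly to an R numeric:

```sql
EXECUTE sp_execute_external_script
   @language = N'R',
   @script = N'OutputDataSet <- InputDataSet',
   @input_data_1 = N'SELECT CAST(UnitPrice AS float) AS UnitPrice
                     FROM Sales.SalesOrderDetail'
WITH RESULT SETS ((UnitPrice float));
```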

Jen Stirrup has written a great blog post that compares the R and SQL languages. It’s the place to start if you’re familiar with one but not the other.

Learning pathway for SQL Server 2016 and R Part 2: Divided by a Common Language

References

sp_execute_external_script (Transact-SQL)
https://msdn.microsoft.com/en-US/library/mt604368.aspx

SQL 2016 R Services
https://msdn.microsoft.com/en-US/library/mt604845.aspx

Getting Started With SQL Server R Service
https://msdn.microsoft.com/en-US/library/mt604885.aspx

SQL 2016 CTP3 and Instant File Initialization

Sometimes you find things when you’re not looking.

I’ve downloaded and installed SQL 2016 CTPs a few times now, most recently being CTP3. It was while I was looking at the Server Configuration page that I noticed a new option. There’s now a checkbox to grant the Perform Volume Maintenance Task privilege to the account that runs the db engine service.

Now, this could have been in earlier CTPs and I just never noticed it. But it saves you from having to go into the Group Policy Editor and granting the permission manually. And, with the option to set multiple tempdb files during installation, it looks like Microsoft is letting you set up some best practices right from the start.

Of course, you should be familiar with why you’re choosing either option. Don’t just click the boxes. If you don’t know about instant file initialization or why you may need multiple tempdb files I’ve supplied a few links below so you can read up on both before installing SQL 2016.
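And if you take over an instance that wasn’t set up this way, you can still add tempdb files after the fact. A sketch, with a hypothetical logical name and path:

```sql
ALTER DATABASE tempdb
ADD FILE (
    NAME = tempdev2,                      -- hypothetical logical name
    FILENAME = 'T:\SQLData\tempdb2.ndf',  -- hypothetical path
    SIZE = 1024MB,
    FILEGROWTH = 256MB);
```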

References

Database Instant File Initialization
https://msdn.microsoft.com/en-us/library/ms175935.aspx

Instant File Initialization – What, Why, and How?
http://www.sqlskills.com/blogs/kimberly/instant-initialization-what-why-and-how/

Misconceptions around instant file initialization
http://www.sqlskills.com/blogs/paul/misconceptions-around-instant-file-initialization/

Correctly adding data files to tempdb
http://www.sqlskills.com/blogs/paul/correctly-adding-data-files-tempdb/

Blitz Results: Only One tempdb Data File
http://www.brentozar.com/blitz/tempdb-data-files/

First Look at R and SQL 2016 – Installing

Every new release of Microsoft’s SQL Server has at least one feature that I want to dive right into. In this edition it’s the ability to use R inside the database itself. At the recently concluded SQL PASS Summit Microsoft announced the release of SQL 2016 CTP 3. And this one has the R bits. I didn’t make it to the Summit this year, and I wasn’t at the earlier Ignite conference where it was first announced so I haven’t seen SQL and R in action yet. So while handing out candy for Halloween this past weekend I downloaded CTP 3 and installed it on a virtual machine. This post is just about installing R and SQL 2016 CTP3; later posts will go more into detail since I haven’t done much other than Hello World yet.

For all my virtual machines I’m running Hyper-V on a laptop that has Windows 10 64 bit as the OS and 16 GB of RAM and a 1 TB Samsung EVO 850 solid state drive. The VM that I built for CTP3 is running Windows 2016 Preview and I’ve allocated 8 GB RAM, 4 CPU and 100 GB of space for testing. The VM is part of my lab domain but it’s not clustered.

As you install SQL you’ll go through the same screens you’ve gone through in past editions. The first change will be on the Feature Selection page. You’ll need to choose Advanced Analytics Extensions to enable R to integrate with your SQL procedures. The .NET 3.5 Framework needs to be pre-installed, something that’s not done by default with Windows Server. Choose whatever other features you want for your instance. It’s not required for R, but if you pick PolyBase Query Service for External Data you’ll have to install Oracle SE Java Runtime Environment Version 7 Update 51 or higher first.


After choosing your features and continuing the installation you’ll get the following warning on the Feature Rules page.


Rule Check Result

Rule “Post-installation steps are required for Advanced Analytics Extensions” generated a warning.

The feature Advanced Analytics Extensions requires some post-installation steps after completing SQL Server setup. Please follow the steps outlined in the link http://go.microsoft.com/fwlink/?LinkId=626645

The warning won’t stop the install, but take note of the URL; you’ll need it later. Click Next to continue the installation. When you get to the Server Configuration page you’ll see a new SQL Server Launchpad service, which will be used to integrate R with SQL.


One more thing. BOL says to enable Mixed Mode to let connections from client workstations run R scripts on the server.

After the install completes there are a few more steps to enable R. First you need to enable the use of external scripts. In a query window run:

EXEC sp_configure 'external scripts enabled', 1;
RECONFIGURE;

You should always configure max server memory on your instance, and that’s especially true if you’re running R. On my test server I’m using Microsoft’s recommendation of 80% of the total, or 6.4 GB out of 8 GB.
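On this server that works out to roughly 6553 MB; setting it is the usual sp_configure call:

```sql
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- ~80% of 8 GB (8192 MB)
EXEC sp_configure 'max server memory (MB)', 6553;
RECONFIGURE;
```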

Next you’ll need to run a script to register the R libraries and set up some needed accounts. Open a command prompt as an administrator and run the following command for a default instance:

%programfiles%\RRO\RRO-3.2.2-for-RRE-7.5.0\R-3.2.2\library\RevoScaleR\rxLibs\x64\RegisterRExt.exe /install

If you’re running on a named instance you’ll need to specify the instance name. Replace <instancename> with the name of your instance.

%programfiles%\RRO\RRO-3.2.2-for-RRE-7.5.0\R-3.2.2\library\RevoScaleR\rxLibs\x64\RegisterRExt.exe /install /instance:<instancename>

Now you need to download and install the R IDE from Revolution R. You’ll need Revolution R Open (RRO) and Revolution R Enterprise (RRE). You can also install them on a client workstation. There’s a different RRE file for server and client so pick the right one. The one with “node” in the file name is for the server and doesn’t include the IDE. Installing RRE will also load the rpart and lattice packages by default.

Finally you’ll need to set up an account that can run R scripts. Create a SQL login account, and in each database where you’ll be running R scripts add that account to the db_datareader role. In master add the account to the new db_rrerole.
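A sketch of those steps, using a hypothetical login name and database:

```sql
-- Hypothetical login for client workstations running R scripts
CREATE LOGIN RUser WITH PASSWORD = 'Str0ngP@ssword!';

USE DemoDB;   -- repeat for each database where R scripts will run
CREATE USER RUser FOR LOGIN RUser;
ALTER ROLE db_datareader ADD MEMBER RUser;

USE master;   -- the CTP3-only db_rrerole lives in master
CREATE USER RUser FOR LOGIN RUser;
ALTER ROLE db_rrerole ADD MEMBER RUser;
```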

That done, I ran my little Hello World test.

Now that I’ve got R set up, I can play around. Stay tuned.

Resources

SQL Server R Services
https://msdn.microsoft.com/en-us/library/mt604845.aspx

Installing SQL R Services
https://msdn.microsoft.com/library/mt604883(SQL.130).aspx

Install Advanced Analytics Extensions
https://msdn.microsoft.com/en-us/library/mt590808.aspx

Install R Packages and Providers for SQL Server R Services
https://msdn.microsoft.com/en-us/library/mt590809.aspx

Post-Installation Server Configuration (SQL Server R Services)
https://msdn.microsoft.com/en-us/library/mt590536.aspx

New option in SSRS Config Manager

I’m finally getting around to looking at SQL Server Reporting Services in SQL 2016 CTP 2.3. I want to check out some of the new features. I haven’t done much yet, except to install and configure SSRS. But I did notice there’s a new option in the SSRS configuration manager.

On the left, under Scale Out Deployment, there is now an option to configure an account to use if you have subscriptions that write reports to a file share. As the screen states, you should use an account that has only enough permissions to do its job, and it should not be the same account as the one running SSRS.


Road Trip to Western Wisconsin PASS next week

Next week I’m presenting at the Western WI PASS user group, on September 16th at 6 PM. My session covers Administering and Troubleshooting SQL Server Reporting Services: I’ll be talking about configuring SSRS and how to monitor what’s happening on the SSRS server. The troubleshooting part will show how to find and eliminate the reasons your reports begin to slow down. No SharePoint, though; I’ll only be covering a native instance. But much of what I’ll be going over applies to both native and SharePoint.

I’m excited to be coming back to the Eau Claire area. I used to spend a lot of time around Augusta in my younger days. I have lots of great memories of fishing on Lake Eau Claire as a kid, though the fish I caught back then were probably not as big as I remember now.

Come on and stop by if you’re anywhere near Altoona next Wednesday. The session starts at 6, and you can register through the link below. I’m looking forward to meeting everyone and sharing any SSRS stories, tips, and tricks you may have.

https://www.eventbrite.com/e/western-wisconsin-pass-september-chapter-meeting-tickets-17664023570

Moving a FileTable file from one directory to another with TSQL

I’m starting to catch up on some of my backlog this month. One of the items on my list has to do with FileTables. I wrote about inserting files into a SQL FileTable a while back and gave an example of how to do it with a T-SQL script. In demos I showed how you could move files around through Windows Explorer. This post shows how to do the same thing with another script.

I’m going to use the same basic environment I’ve set up in the past. To summarize: I have a DemoDB database with a single FileTable called Documents. There are two directories and a file in the root directory, and a sub-directory and a small number of .jpg files in the Demotivators directory.


Now we’ll go back to using a SQL script. We’ll run the following snippet to see what files are in what directory.

USE DemoDB;
GO

-- Get all files in each directory
;WITH FileDirectories AS
(
SELECT name, path_locator--, parent_path_locator
FROM dbo.Documents
WHERE is_directory = 1
)
SELECT CTE.name AS Directory, d.name AS FileName
FROM FileDirectories AS CTE
LEFT JOIN dbo.Documents AS d ON CTE.path_locator = d.parent_path_locator

The results match what we see in Explorer.


You can now see that the sub-directory TestFolder, in the Demotivators directory, has a file called Boffo.jpg. That’s not where we want it; we want it moved to the New Demotivators directory, one level up.

We’ll pretty much follow the same steps we did when we inserted a file. We’ll declare a @path variable to hold the path_locator of the target folder, in this case New Demotivators, and use it to build a new path. Then we just need to update the path_locator for the Boffo.jpg file.

-- Let's move a file from one directory to another
DECLARE @path        HIERARCHYID
DECLARE @new_path    VARCHAR(675)

SELECT @path = path_locator
FROM dbo.Documents
WHERE name = 'New Demotivators'

SELECT @new_path = @path.ToString()     +
CONVERT(VARCHAR(20), CONVERT(BIGINT, SUBSTRING(CONVERT(BINARY(16), NEWID()), 1, 6))) + '.' +
CONVERT(VARCHAR(20), CONVERT(BIGINT, SUBSTRING(CONVERT(BINARY(16), NEWID()), 7, 6))) + '.' +
CONVERT(VARCHAR(20), CONVERT(BIGINT, SUBSTRING(CONVERT(BINARY(16), NEWID()), 13, 4))) + '/'

UPDATE dbo.Documents
SET path_locator = @new_path
WHERE name = 'Boffo.jpg'

The code I’m using to build the new path is the same code that the table constraint on the path_locator column uses.

-- Constraint on path_locator
ALTER TABLE [dbo].[Documents]
ADD  CONSTRAINT [DF__Documents__path___182C9B23]
DEFAULT (convert(hierarchyid, '/' +
convert(varchar(20), convert(bigint, substring(convert(binary(16), newid()), 1, 6))) + '.' +
convert(varchar(20), convert(bigint, substring(convert(binary(16), newid()), 7, 6))) + '.' +
convert(varchar(20), convert(bigint, substring(convert(binary(16), newid()), 13, 4))) + '/'))
FOR [path_locator]
GO

Now we’ll just verify that the file has indeed moved. You’ll see the same thing if you browse with Windows Explorer.


And that’s how simple it is. Let me know if you have a different method.
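One alternative worth trying is hierarchyid’s built-in GetReparentedValue method, which swaps one parent path for another in a single update. A sketch, which I haven’t tested against a FileTable myself:

```sql
DECLARE @old_parent hierarchyid, @new_parent hierarchyid;

SELECT @old_parent = path_locator FROM dbo.Documents WHERE name = 'TestFolder';
SELECT @new_parent = path_locator FROM dbo.Documents WHERE name = 'New Demotivators';

-- Rewrite the file's path so it hangs off the new parent directory
UPDATE dbo.Documents
SET path_locator = path_locator.GetReparentedValue(@old_parent, @new_parent)
WHERE name = 'Boffo.jpg';
```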

Installing SQL 2016 CTP2

So here I was a few days ago, minding my own business, when Microsoft announced that they were releasing SQL Server 2016 CTP2 to the public. It takes me a while to get to know new versions, and it was shaping up to be a miserable spring weekend, weather-wise. What else could I do but download and start installing it? Challenge accepted!

First things first. Here’s where to download the CTP from. It’s probably a good idea to also download and read the datasheet to see what the new features are going to be.

For this test I’m installing SQL 2016 on a Hyper-V virtual machine that I provisioned with a single CPU and 4 GB of memory, and it’s not part of a domain. It’s running Windows Server 2012 R2 with all updates as of May 30th, 2015. The host is my laptop running Windows 8.1 with 16 GB and a 1 TB SSD drive.

After mounting the .iso file I launched setup. The first screens haven’t changed, you’ll go through the same steps as earlier editions, going through the rules, checking for updates, etc. You won’t see anything different until you get to the screen where you choose what features you want to install.


Under Instance Features is the new PolyBase Query Service for External Data. PolyBase will allow you to query Hadoop or Azure blob storage from within SQL. It’s not really new as you could get it if you were running SQL on a Parallel Data Warehouse. You can read up on it here.

I chose to install all features.


If you look at the Prerequisites for selected features pane on the right you’ll notice that the .NET Framework 4.6 is needed for Management Tools. More on this later. Continuing the install, I came to the Feature Rules screen, which showed that I was missing Oracle JRE 7 Update 71.


I cancelled the install, downloaded and installed the file from Oracle, then restarted the SQL installation. I continued to set up the services, and since I chose to install PolyBase I’ll need to configure those as well.


Next up is the database engine. There’s an interesting change on the Data Directories tab. In addition to setting the default directories you can now specify how many files to use for tempdb. The recommendation is the number of cores on the server, up to 8.


Since my virtual server only has a single core I left it at 1.


In SQL 2012 and 2014, if you wanted to install PowerPivot, you needed to install it in a separate instance. In 2016 it’s an option on the Analysis Services Configuration screen. I’m sticking with the Tabular Mode for this instance.

I completed the rest of the configuration screens and started the installation. Remember the .NET Framework 4.6 I mentioned above? It caused a server reboot, and when my server came back up the installation didn’t resume. I started setup again, and this time it showed that only some of the shared components were installed and none of the instance features.


At this point I restored my virtual server and started from scratch. This time I pre-installed the Oracle component, then installed just the shared components. My server did not reboot after installing .NET 4.6 this time, but I rebooted it manually, then installed all of the instance features as the default instance. And no more hiccups.

I’m not sure what happened during the first install after the server restart; I neglected to check the logs afterwards. You may not see the same behavior. But this is a CTP after all, not the final release. I also didn’t pay attention to how long it took (this was on the weekend, after all); my impression is that it was similar to other SQL installs I’ve done on this PC.

Now that it’s running I’m going to start poking around a bit.