
# Tuesday, 01 October 2013
( 70-464 )

Implement database objects (31%)

  • Create and alter tables (complex statements)
    • Develop an optimal strategy for using temporary objects (table variables and temporary tables); how not to rely on triggers solely as a means to manage a table; data version control and management; create tables without using the built-in tools; understand the difference between @Table and #table
  • Design, implement, and troubleshoot security

    • Grant, deny, revoke; unable to connect; execute as; certificates; loginless user; database roles and permissions; contained users; change permission chains

  • Design the locking granularity level

    • Choose the right lock mechanism for a given task, handling and/or avoiding deadlocks; fix locking and blocking issues caused by previous development or third-party apps; analyze a deadlock scenario to alleviate the issue; impact of isolation level and ADO.NET defaults; impact of locks and lock escalation; reduce locking scenarios; how isolation levels affect blocking and locking; identify bottlenecks in, and improve, the data design

  • Maintain indexes

    • Inspect physical characteristics of indexes and perform index maintenance; identify fragmented indexes; identify unused indexes; implement indexes; defrag/rebuild indexes; set up a maintenance strategy for indexes and statistics; optimize indexes (full, filter index); statistics (full, filter) force or fix queue; when to rebuild versus reorganize an index; create a tuning and maintenance strategy for proactive operations

  • Implement data types

    • Use appropriate data types; develop a CLR data type; understand the difference between @Table and #table; impact of GUID (newid, newsequentialid) on database performance, indexing and privacy; use spatial data; LOB data types; understand when and how to use column store and sparse columns; implicit and explicit conversions, integer math

  • Create and modify constraints (complex statements)

    • Create constraints on tables; define constraints; performance implications

  • Work with XML data

    • Implement XML; use XML (Query, Input, Output); transform XML data into relational data; retrieve relational data as XML; FOR XML; design a strategy to transform XML into relational data; design a strategy to query and modify XML data; understand XML data types and their schemas and interoperability, limitations, and restrictions; implement XML schemas and handling of XML data; how to handle it in SQL Server and when and when not to use it, including XML namespaces; import and export XML

Implementing Programming Objects (21%)

  • Write automation scripts

    • Automate backup testing; shrink file; check index fragmentation; archive data; run an SQL Server Integration Services (SSIS) job; check disk space; automate backups

  • Design and implement stored procedures

    • Create stored procedures and other programmatic objects; techniques for developing stored procedures; different types of stored procedure results; create stored procedures for a data access layer; analyze and rewrite procedures and processes; program stored procedures with T-SQL and CLR; use table-valued parameters; encryption

  • Design T-SQL table-valued and scalar functions

    • Ensure code non-regression by keeping consistent signatures for procedures, views, and functions (interfaces); turn scripts that use cursors and loops into a SET-based operation

  • Create, use, and alter user-defined functions (UDFs)

    • Understand deterministic, non-deterministic functions; use cross apply with UDFs; Common Language Runtime (CLR)

  • Create and alter views (complex statements)

    • Set up and configure partitioned tables and partitioned views; design a best practice for using views and stored procedures and remove the direct usage of tables

Design Database Objects (24%)

  • Design tables

    • Data design patterns; develop normalized and de-normalized SQL tables; understand the difference between physical tables, temp tables, temp table variables, and common table expressions; design transactions; design views; describe advantages / disadvantages of using a GUID as a clustered index; understand performance implications of # versus @ temp tables and how to decide which to use, when, and why; use of set-based rather than row-based logic; encryption (other than TDE); table partitioning; filestream and filetable

  • Design for concurrency

    • Develop a strategy to maximize concurrency; handle concurrency to minimize locking and eliminate as much blocking as possible, and to avoid deadlocks; manage the transactions to limit the time to hold lock and have fast transactions (maximize concurrency); define locking and concurrency strategy; impact of read committed snapshot / snapshot isolation; understand what it solves and what it costs

  • Create and alter indexes

    • Create indexes and data structures; create filtered indexes; create an indexing strategy; design and optimize indexes; design indexes and statistics; assess which indexes on a table are likely to be used given different search arguments (SARG); column store indexes; semantic indexes

  • Design data integrity

    • Design table data integrity policy (checks, private key/foreign key, uniqueness, XML schema); select a primary key; data usage patterns

  • Design for implicit and explicit transactions

    • Manage transactions; use transactions in code; ensure data integrity by using transactions; use transactions inside the database using T-SQL and from the "outside" via C#/VB; distributed transaction escalation

Optimize and troubleshoot queries (24%)

  • Optimize and tune queries

    • Tune a badly performing query; identify long running queries; review and optimize code; analyze execution plans to optimize queries; tune a query that is poorly written; tune queries using execution plans and database tuning advisor (DTA); design advanced queries: pivots, utilizing common table expressions (CTE), design the database layout and optimize queries (for speed and/or data size); understand different data types; basic knowledge of query hints; tune query workloads, using realistic data sets not being production data sets; demonstrate use of recursive CTE; full text search; control execution plans

  • Troubleshoot and resolve performance problems

    • Interpret performance monitor data; impact of recovery model on database size, and recovery; how to clean up if .MDF and .LDF files get too large; identify and fix transactional replication problems; detect and resolve server hangs and failures; identify and troubleshoot data access problems

  • Optimize indexing strategies

    • Develop optimal strategy for clustered indexes; analyze index usage; know the difference between the type of indexes and when to choose one over the other; optimize indexing for data warehousing vs. optimize indexing for Online Transaction Processing (OLTP); generate appropriate indexes and statistics with include columns; apply effective and efficient indexes, including the use of INCLUDE lists; full-text indexing

  • Capture and analyze execution plans

    • Collect and read execution plan; review an execution plan to spot potential performance issues; read an execution plan; create an index based on an execution plan; row-based logic vs. set-based logic, batching, splitting implicit transactions

  • Collect performance and system information

    • Use Data Management Views to determine performance issues; from system metadata; gather trace information by using the SQL Server Profiler; develop monitoring strategy for production database; run a Profiler trace and analyze the results; run Profiler for troubleshooting application; collect output from the Database Engine Tuning Advisor; extended events

( 70-464 )

http://www.microsoft.com/learning/en-us/exam-70-464.aspx

http://www.networkworld.com/community/node/43253

I like his advice

  • Find objectives list
  • Do the labs
  • Take practice exams (not applicable as none are available for 464)
  • Don’t study anything on the day!
  • Don’t worry about failing… just take it again

Braindumps are a bad idea

http://borntolearn.mslearn.net/certification/database/w/wiki/523.464-developing-microsoft-sql-server-2012-databases.aspx#fbid=YbBt-gksaOL

( 70-464 )

SQL Server Profiler

  • Monitor the performance of an instance
  • Debug T-SQL statements and stored procedures
  • Identify slow queries
  • Capture on production and replay on test
  • Audit and review activity that occurs

Terminology

  • Template
    • Defines the criteria for what to capture
  • Event
    • an action
    • e.g. a login, a T-SQL command, the start/end of a stored procedure
  • Trace
    • the data that is captured, e.g. into a .trc file

image

image

Querying a Trace Table

image
Useful in testing to see expensive queries, users causing issues, etc.
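
A minimal sketch of querying a saved trace file for the most expensive statements (the path is hypothetical; Duration is reported in microseconds):

SELECT TOP (20)
       TextData, Duration / 1000 AS DurationMs, CPU, Reads, Writes, LoginName
FROM sys.fn_trace_gettable('C:\Traces\MyTrace.trc', DEFAULT)
WHERE EventClass IN (10, 12)   -- RPC:Completed, SQL:BatchCompleted
ORDER BY Duration DESC;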

Database Engine Tuning Advisor

  • A tool that will analyze a database
  • It analyzes database operations that are captured using SQL Server Profiler
  • What can it do
    • recommend the best mix of indexes
    • partitioning
    • indexed views
    • tuning settings
    • drop structures that are no longer useful
    • generate reports about queries, workloads, etc.
    • what-if analysis of hypothetical changes

Query Execution Plans

Table scan – shown as a down arrow… a red flag!  It probably means a missing index

image
Art and Science.. reading and figuring out execution plans

Query Hints

  • SQL Server’s query optimizer selects the best execution plan for a query
  • Query hint – in rare cases you can insert a hint to override it
  • The query optimizer takes these hints personally! (they are directives, not suggestions)
    • they affect all the operators in the statement (see the sketch below)
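
A minimal sketch against AdventureWorks – the OPTION clause applies the hints to the whole statement:

SELECT p.ProductID, p.Name
FROM Production.Product AS p
WHERE p.Color = 'Blue'
OPTION (MAXDOP 1, RECOMPILE);   -- force a serial plan and recompile on every execution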

System Views

For us to use to get meta information. For example, in AdventureWorks:

image

SELECT * FROM sys.tables;
SELECT * FROM sys.databases;
SELECT * FROM sys.views;

Performance Monitor

Monitor SQL Server in real time

  • transaction log
  • locking
  • average wait time
  • deadlocks

image
Look for the SQLServer: counter objects (not present here on Windows 8)

( 70-464 )
  • Inadequate concurrency
    • lost updates
      • two transactions select same row and then update row
      • the last update overwrites the earlier update
      • earlier update becomes lost data
    • uncommitted dependency (Dirty Read)
      • Transaction A reads data that has been updated by Transaction B, but B has not been committed and rolls back.
    • inconsistent analysis (nonrepeatable read)
      • Transaction B reads data that is being updated by transaction A, but gets different values on several reads
    • phantom reads
      • Transaction A reads a range of rows twice; Transaction B inserts or deletes a row in that range in between, so A sees a different set of rows on the second read
  • How can concurrency be allowed yet controlled?
    • locking
      • preventing one transaction from working with data until another transaction has finished working with the data

Normalization and Concurrency

The more locking, the slower the perceived performance

  • 2 main ways of easing concurrency problems
    • changing the transaction isolation level
    • normalization
      • i.e. spread data out into multiple tables

A table is ‘wide’ if it has lots of columns.

Transaction Isolation Levels

  • Controls the level of transaction isolation
    • when locks are applied
    • how long the locks are maintained
  • 5 transaction isolation levels
    • read uncommitted
      • wild west.  Allows statements to read rows that have been modified by other transactions but not yet committed
    • read committed
      • data modified but not yet committed by other transactions cannot be read.. **Default
    • repeatable read
      • no other transaction can modify data that has been read by the current transaction
    • snapshot
      • almost like a snapshot of the data when the transaction started, i.e. nothing else can change what this transaction sees
    • serializable
      • nothing that could affect the current transaction is allowed, not even inserting new rows into the range being read

SET TRANSACTION ISOLATION LEVEL
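
For example, a minimal sketch of changing the level for the current session (Production.Product from AdventureWorks is just an example table):

SET TRANSACTION ISOLATION LEVEL REPEATABLE READ;

BEGIN TRAN;
    -- rows read here keep their shared locks until COMMIT/ROLLBACK,
    -- so no other transaction can modify them in the meantime
    SELECT ProductID, Name FROM Production.Product WHERE ProductID = 1;
COMMIT;

SET TRANSACTION ISOLATION LEVEL READ COMMITTED;   -- back to the default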

Monitoring and Managing Locking

Locking is inevitable

  • Minimize locking
    • good programming techniques
    • database design
      • normalize
  • Use Activity Monitor to see locking (or query sys.dm_tran_locks, as sketched below)
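
A minimal sketch of checking locking with a DMV instead of Activity Monitor:

SELECT request_session_id,
       resource_type,
       DB_NAME(resource_database_id) AS DatabaseName,
       request_mode,
       request_status
FROM sys.dm_tran_locks
WHERE request_status = 'WAIT';   -- sessions currently waiting on (blocked by) a lock
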
# Monday, 30 September 2013
( 70-464 )
  • Primary Data File – every db has one.  .mdf extension
  • Secondary Data File – spread the db across multiple disks; .ndf extension
  • Transaction Log File – .ldf extension

Disk-intensive I/O apps

  • More physical disks improve I/O
  • Faster physical disks improve I/O
  • File location is critical to I/O
  • Data files and transaction log files should be on separate physical disks

Filegroups

  • Contains SQL Server data files
  • Every Database has a Primary filegroup
  • Additional (user-defined) filegroups
    • allows files or objects to be grouped together for administration
    • allows files or objects to be placed on specific physical disks
  • Most advantages are more easily obtained by using RAID (a filegroup sketch follows)
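
A minimal filegroup sketch (the database name and file path are hypothetical):

ALTER DATABASE SalesDB ADD FILEGROUP ArchiveFG;
GO
ALTER DATABASE SalesDB
ADD FILE ( NAME = SalesArchive1,
           FILENAME = 'D:\Data\SalesArchive1.ndf',   -- hypothetical path
           SIZE = 100MB )
TO FILEGROUP ArchiveFG;
GO
CREATE TABLE dbo.SalesArchive
( SalesArchiveID int IDENTITY(1,1) PRIMARY KEY,
  ArchivedWhen datetime2 NOT NULL
) ON ArchiveFG;   -- object placed on the new filegroup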

image

BLOBs

Binary Large Object

  • A collection of binary data stored as a single entity
    • Photo
    • Audio
    • Documents
  • DBs are primarily designed to store structured character and numeric data
  • BLOBs contain non-ASCII, binary data
  • BLOBs need to be stored bit by bit, and retrieved bit by bit
    • Storing within the DB becomes less efficient once objects are > 1MB
    • Storage in the file system

FILESTREAM Basics

  • Store unstructured data on the file system not in a table
  • Integrates SQL Server with NTFS
    • stores varbinary(max)
  • Uses the NT system cache
  • Disabled by default when you install or upgrade
  • Requires a specific FILESTREAM filegroup (which contains the data containers – the inner workings of FILESTREAM filegroups)

When use?

  • When binary object being stored > 1MB
  • When the fastest possible read access is important
  • <1MB is faster to store in DB

Using Filestream

  • the FILESTREAM attribute must be set in T-SQL, for example:
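
A minimal sketch, assuming FILESTREAM is enabled on the instance and the database already has a FILESTREAM filegroup (table and column names are hypothetical):

CREATE TABLE dbo.Documents
( DocumentID uniqueidentifier ROWGUIDCOL NOT NULL UNIQUE DEFAULT NEWID(),   -- a ROWGUIDCOL is required
  DocumentName nvarchar(200) NOT NULL,
  DocumentData varbinary(max) FILESTREAM NULL   -- stored in the file system, not in the data file
);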

FileTable Type

New in 2012

  • New Table Type
  • Allows data to be stored in a db table
  • But it can then be seen from the Windows OS
    • It is exposed via a Windows share
  • Every row represents a file or a directory
  • The Windows side is non-transactional (see the sketch below)
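
A minimal sketch, assuming FILESTREAM and non-transacted access are already enabled for the database (names are hypothetical):

CREATE TABLE dbo.DocumentStore AS FILETABLE
WITH ( FILETABLE_DIRECTORY = 'DocumentStore',   -- folder name exposed via the Windows share
       FILETABLE_COLLATE_FILENAME = database_default );
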
( 70-464 )
  • Changes the way data is stored in the table
  • Breaks logical storage into smaller pieces
    • done ‘horizontally’

e.g. 30,000 rows for each month, but you only really care about the current month

So can partition based on month column

image

  • archival functionality, i.e. switch a partition out to another table (an archive table – the data movement is very fast)
    • keeps the production table small
  • maintenance – filegroups
  • table partitioning has two threads, almost: easier maintenance/archival and query performance
  • use only on ‘large’ tables (see the sketch below)
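
A minimal sketch of month-based partitioning (the table, function, and scheme names are hypothetical; a real design would typically map partitions to separate filegroups):

CREATE PARTITION FUNCTION pfOrderMonth (date)
AS RANGE RIGHT FOR VALUES ('2013-02-01', '2013-03-01', '2013-04-01');   -- one boundary per month
GO
CREATE PARTITION SCHEME psOrderMonth
AS PARTITION pfOrderMonth ALL TO ([PRIMARY]);
GO
CREATE TABLE dbo.Orders
( OrderID int IDENTITY(1,1),
  OrderDate date NOT NULL,
  Amount decimal(18,2) NOT NULL
) ON psOrderMonth (OrderDate);   -- rows are placed into partitions by month
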
( 70-464 )

Avoid nullable columns where possible, because they create complexity in DB operations

Should not contain repeating values or columns

Names should be short and self-explanatory

The first character must be a letter, _ or # (# makes it a temp table, ## a global temp table)

A table can contain up to 1,024 columns

Temp Tables

Local – only visible in current session. #tablename

Global – visible to all sessions ##tablenames

  • To carry out complex logic in a sp
    • logic that cannot be done within single query
      • load results into temp table
      • run query on temp
    • great for reporting

Simple, no security issues

Speed – generally faster than normal tables, but they are a physical write to tempdb

Generally slower than table variables for small amounts of data

Excessive use can create perf issues

Table Variables

is… a variable.

a special data type

use a DECLARE statement and @ (it must be a variable)

Lives mostly in memory (though still backed by tempdb), so very fast for small data sets

Automatically cleaned up at end of fn, sp etc..

Limitations

  • Operations on table variables are not supported in the optimizer
  • Can’t create indexes (can do trick, but not true indexes)
  • The more data you store, the more opportunity there is to lose efficiency

Temp Table vs Table Variables

Temp Table

  • When you need to store a lot of data.. MS speak > 100 rows
  • When need complex actions

Table Variables

  • When you don’t need to store a lot.. MS speak < 100 rows
  • Do not need to perform complex actions
  • But be careful when memory is an issue, as most of the data is held in memory (see the sketch below)
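
A minimal sketch of the two declarations side by side (names are hypothetical):

-- Local temporary table: a physical object in tempdb, visible only to this session
CREATE TABLE #RecentOrders
( OrderID int PRIMARY KEY,
  OrderDate date );

-- Table variable: declared like any other variable, cleaned up automatically at end of scope
DECLARE @RecentOrders TABLE
( OrderID int PRIMARY KEY,
  OrderDate date );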

GUID’s

Globally Unique Identifier

16 bytes – a 128-bit identifier

All values must be unique

NEWID()

NEWSEQUENTIALID() – always creates a GUID greater than any it has generated since the server was last restarted

Security issue – the next value can be predicted, but sequential GUIDs are better for indexing

CREATE TABLE MarktTest
(
    ID uniqueidentifier DEFAULT NEWID(),
    City varchar(25),
    State char(2)
)
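
NEWSEQUENTIALID() can only be used in a DEFAULT constraint, so a sequential version of the table above would look like this (a sketch; the table name is hypothetical):

CREATE TABLE MarkTest2
(
    ID uniqueidentifier DEFAULT NEWSEQUENTIALID(),   -- only valid as a column default
    City varchar(25),
    State char(2)
)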

GUID’s created:

  • ID number of the network card
  • Unique number from the CPU clock
image
( 70-464 )

image
Ctrl+A

image
Table View, Column Names

# Wednesday, 25 September 2013
( 70-464 )

http://msftdbprodsamples.codeplex.com/releases

image
Scroll down to find the 1MB zipped-up Lite database.

Attach the .mdf

If you get an access violation (as there is no .ldf):

copy the .mdf to here:

C:\Program Files\Microsoft SQL Server\MSSQL11.SQL2012\MSSQL\DATA

then remove the log file entry when attaching.  SQL Server will automatically create a new one.

( 70-464 )

An alternative name for a DB object (i.e. a synonym)
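
A minimal sketch, assuming this refers to a synonym (the names are hypothetical):

CREATE SYNONYM dbo.Prod FOR AdventureWorks.Production.Product;
GO
SELECT TOP (5) * FROM dbo.Prod;   -- queried exactly like the underlying table
GO
DROP SYNONYM dbo.Prod;
GO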

image

image

( 70-464 )

image

James is in 2 states.

Can’t delete 3.  Orphaned data.

Transactions

MDF – Data, Tables, SP’s

LDF – Transaction log file.

Records all modifications

Transactions – a group of modifications that either all succeed or all fail

Very useful for data recovery and backups.

Checkpoints – writing to the db

  • changes are written to the transaction log first
  • then written to the db at a checkpoint

On rollback, the changes are undone using the transaction log

image

Refresh Intellisense

Or local cache.

image
If you create table ML and then query it, you get a red squiggly.

image
Ctrl+Shift+R

( 70-464 )

image
Got PK’s

image
Splitting out

image

If there are lots of INSERT/UPDATE/DELETE operations, you want as many (narrower) tables as possible to avoid locking

If there are lots of SELECTs, you want fewer tables, for fewer joins

# Wednesday, 18 September 2013
( 70-464 )

image

image

image

image

image

image

image

Review Questions
1. What is the function of a stopword?
2. What are iFilters used for?
3. What is the difference between FREETEXT and FREETEXTTABLE?
4. How do you configure a thesaurus for use with full-text indexing?


Best Practices
1. Create a stoplist for your company. Add to the stoplist any words that are used in almost all your
company documents (a sketch follows the list).
2. Use auto-population of indexes except in rare cases with specific issues. (These situations would
typically involve data that is updated at a high rate and where the index does not need to be kept
completely up to date.)
3. Try to encourage developers in your organization to offer much more flexible user interfaces to your
end users, based on full-text indexes in SQL Server.
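
A minimal stoplist sketch for best practice 1 (the stoplist name and word are hypothetical):

CREATE FULLTEXT STOPLIST CompanyStopList FROM SYSTEM STOPLIST;
GO
ALTER FULLTEXT STOPLIST CompanyStopList ADD 'adventureworks' LANGUAGE 1033;
GO
-- then associate it with a full-text index, e.g.:
-- ALTER FULLTEXT INDEX ON dbo.Messages SET STOPLIST CompanyStopList;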

( 70-464 )

image

image

-- Demonstration 2A

-- Step 1 - Open a new query window to the AdventureWorks database

USE AdventureWorks;
GO

-- Step 2 - Create a fulltext catalog

CREATE FULLTEXT CATALOG FTCatalog AS DEFAULT;
GO

-- Step 3 - Create a fulltext index

CREATE FULLTEXT INDEX ON Person.Contact(FirstName)
KEY INDEX PK_Contact_ContactID
WITH CHANGE_TRACKING AUTO;
GO

-- Step 4 - Check that the population has completed
--          by querying sys.fulltext_indexes

SELECT is_enabled, has_crawl_completed, crawl_type,
       crawl_start_date, crawl_end_date
FROM sys.fulltext_indexes;
GO
( 70-464 )

image

image

image

-- Demonstration 1A

-- Step 1 - Open a new query window to the IFTS database

USE IFTS;
GO

-- Step 2 - Show the messages that have been indexed

SELECT * FROM dbo.Messages ORDER BY MessageID;
GO

-- Step 3 - Search for a specific word

SELECT MessageID,Description
FROM dbo.Messages
WHERE CONTAINS(Description,'file')
ORDER BY MessageID;

-- Step 4 - Search for proximity  note that without ranking
--          the results are not that useful

SELECT MessageID,Description
FROM dbo.Messages
WHERE CONTAINS(Description,'NEAR((file,read),10,FALSE)')
ORDER BY MessageID;

-- Step 5 - Search for prefix

SELECT MessageID,Description
FROM dbo.Messages
WHERE CONTAINS(Description,'"fi*"')
ORDER BY MessageID;

-- Step 6 - See if filing is there

SELECT MessageID,Description
FROM dbo.Messages
WHERE CONTAINS(Description,'"filing"')
ORDER BY MessageID;

-- Step 7 - Note that inflectional forms of file finds it

SELECT MessageID,Description
FROM dbo.Messages
WHERE CONTAINS(Description,'FORMSOF(INFLECTIONAL,file)')
ORDER BY MessageID;

-- Step 8 - Search for a sentence

SELECT MessageID,Description
FROM dbo.Messages
WHERE FREETEXT(Description,'statement was terminated')
ORDER BY MessageID;

-- Step 9 - Note how much more meaningful when ranked

SELECT m.MessageID,m.Description,ft.RANK
FROM dbo.Messages AS m
INNER JOIN
FREETEXTTABLE(dbo.Messages,Description,'statement was terminated') AS ft
ON m.MessageID = ft.[KEY]
ORDER BY ft.RANK DESC;

-- Step 10 - Show how weighting is also useful

SELECT m.MessageID,m.Description,ft.RANK
FROM dbo.Messages AS m
INNER JOIN
CONTAINSTABLE(dbo.Messages,Description,
'ISABOUT (certificate weight (.8), symmetric weight (.4),
          key weight (.2) )') AS ft
ON m.MessageID = ft.[KEY]
ORDER BY ft.RANK DESC;
( 70-464 )

image

image

image

-- Demonstration 3A

-- Step 1 - Open a new query window to the AdventureWorks database

USE AdventureWorks;
GO

ALTER TABLE Person.Address ADD SpatialLocation geography;
GO

UPDATE Person.Address SET SpatialLocation =
  geography::STGeomFromText(CASE AddressID WHEN 1 THEN 'POINT (-74.05 40)'
                                           WHEN 2 THEN 'POINT (-40 40)'
                                           WHEN 3 THEN 'POINT (-73 40.5)'
                            END,4326);
GO


-- Step 2 - Which salesperson is closest to New York?  

DECLARE @NewYork geography;
SET @NewYork = geography::STGeomFromText('POINT (-74.007339 40.726966)',4326);

SELECT c.ContactID,
       c.FirstName,
       c.LastName,
       a.SpatialLocation.STDistance(@NewYork) / 1000 AS DistanceKms
FROM Person.Contact AS c
INNER JOIN Sales.SalesPerson AS sp
ON c.ContactID = sp.SalesPersonID
INNER JOIN HumanResources.EmployeeAddress AS ea
ON c.ContactID = ea.EmployeeID
INNER JOIN Person.Address AS a
ON ea.AddressID = a.AddressID
WHERE (a.SpatialLocation.STDistance(@NewYork) / 1000) < 500
ORDER BY DistanceKms;
GO

-- Step 3 - Which two salespeople live the closest together?

WITH SalesPersonLocation
AS ( SELECT c.ContactID,
            c.FirstName,
            c.LastName,
            a.SpatialLocation
     FROM Person.Contact AS c
     INNER JOIN Sales.SalesPerson AS sp
     ON c.ContactID = sp.SalesPersonID
     INNER JOIN HumanResources.EmployeeAddress AS ea
     ON sp.SalesPersonID = ea.EmployeeID
     INNER JOIN Person.Address AS a
     ON ea.AddressID = a.AddressID
   )
SELECT TOP(1)
        sp1.FirstName,
        sp1.LastName,
        sp2.FirstName,
        sp2.LastName,
        sp1.SpatialLocation.STDistance(sp2.SpatialLocation) / 1000
          AS DistanceKms
FROM SalesPersonLocation AS sp1
CROSS JOIN SalesPersonLocation AS sp2
WHERE sp1.ContactID <> sp2.ContactID
ORDER BY DistanceKms;

GO

image

Review Questions
1. What is the main difference between the geometry and geography data types?
2. Why does the order of points matter when defining a polygon?


Best Practices
1. Set the SRID for geometry objects to 0 to ensure that operations on multiple geometry objects can
always be performed.
2. Use a CHECK CONSTRAINT to ensure that the SRID values for a column are consistent across all rows.
3. Before creating spatial indexes, make sure that the queries that need to be executed against the data
use predicate forms that are supported by the types of index you are creating.

 

image

( 70-464 )

image

image

image

 

image

-- Demonstration 2A

-- Step 1 - Open a new query window to the tempdb database

USE tempdb;
GO

-- Step 2 - Draw a shape using geometry

DECLARE @Shape geometry;
SET @Shape = geometry::STGeomFromText('POLYGON ((10 10, 25 15, 35 15, 40 10, 10 10))',0);
SELECT @Shape;
GO

-- Step 3 - Draw two shapes

DECLARE @Shape1 geometry;
DECLARE @Shape2 geometry;
SET @Shape1 = geometry::STGeomFromText('POLYGON ((10 10, 25 15, 35 15, 40 10, 10 10))',0);
SET @Shape2 = geometry::STGeomFromText('POLYGON ((10 10, 25 5, 35 5, 40 10, 10 10))',0);
SELECT @Shape1
UNION ALL
SELECT @Shape2;
GO

-- Step 4 - Show what happens if you perform a UNION rather than a UNION ALL. This will fail as spatial types are not comparable.

DECLARE @Shape1 geometry;
DECLARE @Shape2 geometry;
SET @Shape1 = geometry::STGeomFromText('POLYGON ((10 10, 25 15, 35 15, 40 10, 10 10))',0);
SET @Shape2 = geometry::STGeomFromText('POLYGON ((10 10, 25 5, 35 5, 40 10, 10 10))',0);
SELECT @Shape1
UNION
SELECT @Shape2;
GO

-- Step 5 - Join the two shapes together

DECLARE @Shape1 geometry;
DECLARE @Shape2 geometry;
SET @Shape1 = geometry::STGeomFromText('POLYGON ((10 10, 25 15, 35 15, 40 10, 10 10))',0);
SET @Shape2 = geometry::STGeomFromText('POLYGON ((10 10, 25 5, 35 5, 40 10, 10 10))',0);
SELECT @Shape1.STUnion(@Shape2);
GO

-- Step 6 - How far is it from New York to Los Angeles?

DECLARE @NewYork geography;
DECLARE @LosAngeles geography;
SET @NewYork = geography::STGeomFromText('POINT (-74.007339 40.726966)',4326);
SET @LosAngeles = geography::STGeomFromText('POINT (-118.24585 34.083375)',4326);
SELECT @NewYork.STDistance(@LosAngeles);
GO

-- Step 7 - Draw the Pentagon

DECLARE @Pentagon geography;
SET @Pentagon = geography::STPolyFromText(
  'POLYGON(( -77.0532219483429 38.870863029297695,
             -77.05468297004701 38.87304314667469,
             -77.05788016319276 38.872800914712734,
             -77.05849170684814 38.870219840133124,
             -77.05556273460198 38.8690670969195,
             -77.0532219483429 38.870863029297695),
           ( -77.05582022666931 38.8702866652523,
             -77.0569360256195 38.870734733163644,
             -77.05673214773439 38.87170668418343,
             -77.0554769039154 38.871848684516294,
             -77.05491900444031 38.87097997215688,
             -77.05582022666931 38.8702866652523))',
           4326);
SELECT @Pentagon;
GO

-- Step 8 - Call the ToString method and notice the use of
--          the Z and M values that are stored but not processed

DECLARE @Point geometry;
SET @Point = geometry::STPointFromText('POINT(10 20 15 5)', 0);
SELECT @Point.ToString()
GO

-- Step 9 - Use GML for input

DECLARE @Point geography;
SET @Point = geography::GeomFromGml('
  <Point xmlns="http://www.opengis.net/gml">
      <pos>12 50</pos>
  </Point>',4326);
SELECT @Point;
GO

-- Step 10 - Output GML from a location (start and end points
--           of the Panama Canal only  not the full shape)

DECLARE @PanamaCanal geography;
SET @PanamaCanal
  = geography::STLineFromText('LINESTRING( -79.909 9.339, -79.536 8.942 )',4326);
SELECT @PanamaCanal,@PanamaCanal.AsGml();
GO

-- Step 11 - Show how collections can include different types of objects

DECLARE @ShapeCollection geometry;
SET @ShapeCollection = geometry::STGeomCollFromText(
   'GEOMETRYCOLLECTION( POLYGON((15 15, 10 15, 10 10, 15 15)),
                        POINT(10 10))',0);
SELECT @ShapeCollection;
GO

image

image

image

( 70-464 )

image

image

image

Key Points
OPENXML provides a rowset over in-memory XML documents, which is similar to a table or a view.
OPENXML allows access to the XML data as though it is a relational rowset. It does this by providing a
rowset view of the internal representation of an XML document.
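
A minimal OPENXML sketch (the document shape and names are hypothetical):

DECLARE @xml nvarchar(max) = N'<Customers>
  <Customer CustomerID="1" Name="Contoso" />
  <Customer CustomerID="2" Name="Fabrikam" />
</Customers>';
DECLARE @doc int;

EXEC sp_xml_preparedocument @doc OUTPUT, @xml;   -- parse into an in-memory document

SELECT CustomerID, Name
FROM OPENXML(@doc, '/Customers/Customer', 1)     -- 1 = attribute-centric mapping
WITH ( CustomerID int, Name nvarchar(50) );

EXEC sp_xml_removedocument @doc;                 -- always release the document handle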

Review Questions
1. What are AUTO mode queries?
2. What are PATH mode queries?
3. What does the nodes() method do?
4. What are RAW mode queries?


Best Practices
1. Convert existing code that uses the nvarchar data type for XML parameters to use the XML data type.
2. Provide meaningful row names when using RAW mode by using the optional name parameter to the
RAW clause.
3. Check the query plans for queries using the nodes() method to ensure that the lack of cardinality
estimates is not producing a poor execution plan.

( 70-464 )

image

Best Practices
1. Use appropriate data types for your database columns. Do not store all your data in XML columns.
2. Use XML schemas only when required. Validating data against schemas incurs substantial processing
overhead.
3. Ensure you have at least basic XML proficiency when working with SQL Server, even if you will be
working primarily in database administration.
4. Index XML data stored in database columns. Use the appropriate type of index for the types of
queries expected.

( 70-464 )

image

image

image

-- Demonstration 3A

-- Step 1 - Open a new query window against tempdb

USE tempdb;
GO

-- Step 2 - Create a primary XML index

CREATE PRIMARY XML INDEX IX_ProductImport_ProductDetails
ON dbo.ProductImport (ProductDetails);
GO

-- Step 3 - Create a secondary VALUE index

CREATE XML INDEX IX_ProductImport_ProductDetails_Value
ON dbo.ProductImport (ProductDetails)
USING XML INDEX IX_ProductImport_ProductDetails
FOR VALUE;
GO

-- Step 4 - Query the sys.xml_indexes system view

SELECT * FROM sys.xml_indexes;
GO

-- Step 5 - Drop and recreate the table without a primary key

DROP TABLE dbo.ProductImport;
GO

CREATE TABLE dbo.ProductImport
( ProductImportID int IDENTITY(1,1),
  ProductDetails xml (CONTENT dbo.ProductDetailsSchema)
);
GO

-- Step 6 - Now try to re-add the primary xml index

CREATE PRIMARY XML INDEX IX_ProductImport_ProductDetails
ON dbo.ProductImport (ProductDetails);
GO

-- Step 7 - Note that not only is a primary key required, a clustered primary key is required
( 70-464 )

 

image

image

 

image


-- Step 10 - Note how an XML schema (often called an XSD schema) can
--           be included within an XML document to make the document
--           even more self-describing

image

image

image

( 70-464 )

image

image

image

using System;
using System.Data;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using System.Text;
using Microsoft.SqlServer.Server;

public partial class UserDefinedFunctions
{
    [SqlFunction(DataAccess=DataAccessKind.None)]
    public static SqlString FormatPhoneNumber(SqlString PhoneNumberToFormat)
    {
        SqlString returnSqlString = new SqlString();

        if (PhoneNumberToFormat.IsNull)
            return returnSqlString;

        StringBuilder numbersOnly = new StringBuilder("");

        for (int counter = 0; counter < PhoneNumberToFormat.Value.Length; counter++)
        {
            if ((PhoneNumberToFormat.Value[counter] >= '0')
                && (PhoneNumberToFormat.Value[counter] <= '9'))
                numbersOnly.Append(PhoneNumberToFormat.Value[counter]);
        }
        
        string numbers = numbersOnly.ToString();
        string returnString = "";

        switch (numbers.Length)
        {
            case 6:
                returnString = numbers.Substring(0,3) + "-" + numbers.Substring(3,3);
                break;
            case 7:
                returnString = numbers.Substring(0,3) + "-" + numbers.Substring(3,4);
                break;
            case 8:
                returnString = numbers.Substring(0,4) + "-" + numbers.Substring(4,4);
                break;
            case 10:
                returnString = "(" + numbers.Substring(0,3) + ") "
                             + numbers.Substring(3,3) + "-" + numbers.Substring(6,4);
                break;
        }

        return new SqlString(returnString);
    }
};

String manipulation is better done in CLR/C# than in T-SQL

image

-- Demonstration 3B - Stored Procedures and Triggers

-- Step 1: Open a new query window
--         and use the AdventureWorks database

USE AdventureWorks;
GO

-- Step 2: Catalog the stored procedure dbo.ProductsByColor

CREATE PROCEDURE dbo.ProductsByColor
(@Color nvarchar(16))
AS EXTERNAL NAME
SQLCLR10776A.StoredProcedures.ProductsByColor;
GO

-- Step 3: Test the stored procedure
--         (make sure you also look on the messages tab in the
--         output as the stored procedure also sends the
--         equivalent of a PRINT statement there)

EXEC dbo.ProductsByColor NULL;
EXEC dbo.ProductsByColor 'Blue';
GO

-- Step 4: Catalog the stored procedure dbo.WriteOSFile
--         Note: this procedure will perform external access

CREATE PROCEDURE dbo.WriteOSFile
(@OutputFileName nvarchar(400), @FileData varbinary(max))
AS EXTERNAL NAME SQLCLR10776A.StoredProcedures.WriteOSFile;
GO

-- Step 5: Test the dbo.WriteOSFile stored procedure

DECLARE @XmlData varbinary(max);
SET @XmlData = CAST(
N'<CustomerBalances>
    <Customer CustomerID="12" Balance="234.24" />
    <Customer CustomerID="14" Balance="100.00" />
</CustomerBalances>'
AS varbinary(max));

EXEC dbo.WriteOSFile 'D:\10776A_Labs\10776A_16_PRJ\CustomerBalances.xml',
                     @XmlData;
GO

-- Step 6: Open the file that was written using notepad
--         and view the results

-- Step 7: Create and populate a test table

CREATE TABLE dbo.TestTable
( TestTableID int IDENTITY(1,1) PRIMARY KEY,
  TestName nvarchar(20)
);
GO
INSERT INTO dbo.TestTable (TestName)
  VALUES ('Hello'),('There');
GO

-- Step 8: Catalog a trigger on the table that fires
--         when it is updated

CREATE TRIGGER TR_TestTableUpdate
ON dbo.TestTable
AFTER UPDATE
AS EXTERNAL NAME SQLCLR10776A.Triggers.TestTableUpdate;
GO

-- Step 9: Test the trigger  it simply sends a message
--         to the messages tab when the trigger fires

UPDATE dbo.TestTable
SET TestName = 'Goodbye'
WHERE TestTableID = 1;
GO

-- Step 10: Drop the test table

DROP TABLE dbo.TestTable;
GO

image

e.g. a MEDIAN aggregate

image

The geometry, geography, and hierarchyid system data types are in fact system CLR data types.

image

Review Questions
1. Which types of database objects can be implemented using managed code?
2. What purpose do attributes have in CLR managed code?

Best Practices
1. The biggest mistake made when deciding between T-SQL and Managed Code is to assume that either
one is the correct answer for every situation. Each has benefits and limitations and should be used for
the appropriate tasks.
2. Developers should avoid using SQL CLR to implement code that would be better
placed on another application tier (such as on a client system).
3. DBAs should avoid refusing to allow SQL CLR code without consideration. As you have seen in this
module, there is code that should be implemented in Managed Code rather than in T-SQL.
4. DBAs should set boundaries for developers:
• No row-based code that should be set-based T-SQL operations.
• Limited use of EXTERNAL_ACCESS permissions and only after justification.
• Rare use of UNSAFE permissions and only after very serious justifications and testing.

( 70-464 )

image

image

image

Loaded the external assembly.  I had to change the owner of the db to be SA as I didn’t have the DC running.

( 70-464 )

INSTEAD OF triggers

-- Demonstration 3A - INSTEAD OF Triggers

-- Step 1: Open a new query window
--         and use the tempdb database

USE tempdb;
GO

-- Step 2: Create and populate the dbo.CurrentPrice table
--         Note the additional IsValid column

CREATE TABLE dbo.CurrentPrice
(
    CurrentPriceID int IDENTITY(1,1)
      CONSTRAINT PK_CurrentPrice PRIMARY KEY,
    SellingPrice decimal(18,2) NOT NULL,
    LastModified datetime2 NOT NULL
      CONSTRAINT DF_CurrentPrice_LastModified
      DEFAULT (SYSDATETIME()),
    ModifiedBy sysname NOT NULL
      CONSTRAINT DF_CurrentPrice_ModifiedBy
      DEFAULT (ORIGINAL_LOGIN()),
    IsValid bit NOT NULL
      CONSTRAINT DF_CurrentPrice_IsValid
      DEFAULT (1)
);
GO

INSERT INTO dbo.CurrentPrice
  (SellingPrice)
  VALUES (2.3), (4.3), (5);
GO

SELECT * FROM dbo.CurrentPrice;
GO

-- Step 3: Create the trigger for INSTEAD OF DELETE

CREATE TRIGGER TR_CurrentPrice_Delete
ON dbo.CurrentPrice
INSTEAD OF DELETE AS BEGIN
  SET NOCOUNT ON;
  UPDATE cp
  SET cp.IsValid = 0
  FROM dbo.CurrentPrice AS cp
  INNER JOIN deleted AS d
  ON cp.CurrentPriceID = d.CurrentPriceID;
END;
GO

-- Step 4: Try to delete a row
--         Note the number of rows shown as being affected
--         and that no errors are returned

DELETE dbo.CurrentPrice
WHERE CurrentPriceID = 2;
GO

-- Step 5: Requery the table and note that row 2 is still
--         there but now modified instead of deleted

SELECT * FROM dbo.CurrentPrice;
GO

-- Step 6: Query sys.triggers and note the value in
--         the is_instead_of_trigger column

SELECT * FROM sys.triggers;
GO

-- Step 7: Drop the table (and with it, the trigger)

DROP TABLE dbo.CurrentPrice;
GO

-- Step 8: Create another table with two string columns

CREATE TABLE dbo.PostalCode
( CustomerID int PRIMARY KEY,
  PostalCode nvarchar(5) NOT NULL,
  PostalSubCode nvarchar(5) NULL
);
GO

-- Step 9: Create a view over the table that concatenates the string columns

CREATE VIEW dbo.PostalRegion
AS
SELECT CustomerID,
       PostalCode + COALESCE('-' + PostalSubCode,'') AS PostalRegion
FROM dbo.PostalCode;
GO

-- Step 10: Insert some data to the base table

INSERT dbo.PostalCode (CustomerID,PostalCode,PostalSubCode)
VALUES (1,'23422','234'),
       (2,'23523',NULL),
       (3,'08022','523');
GO
       
-- Step 11: Query the view to see the results

SELECT * FROM dbo.PostalRegion;
GO

-- Step 12: Try to insert into the view (will fail - note the error)

INSERT INTO dbo.PostalRegion (CustomerID,PostalRegion)
VALUES (4,'09232-432');
GO

-- Step 13: Try to update the view (will fail - note the error)

UPDATE dbo.PostalRegion SET PostalRegion = '23234-523' WHERE CustomerID = 3;
GO

-- Step 14: Try to delete a row

DELETE FROM dbo.PostalRegion WHERE CustomerID = 3;
GO

-- Question: Why does the DELETE succeed when INSERT and UPDATE fail?

-- Step 15: Create an INSTEAD OF INSERT trigger

CREATE TRIGGER TR_PostalRegion_Insert
ON dbo.PostalRegion
INSTEAD OF INSERT
AS
INSERT INTO dbo.PostalCode
SELECT CustomerID,
       SUBSTRING(PostalRegion,1,5),
       CASE WHEN SUBSTRING(PostalRegion,7,5) <> '' THEN SUBSTRING(PostalRegion,7,5) END
FROM inserted;
GO

-- Step 16: Try to insert into the view again

INSERT INTO dbo.PostalRegion (CustomerID,PostalRegion)
VALUES (4,'09232-432');
GO

-- Step 17: Note that two row counts have been returned

-- Step 18: Alter the trigger to remove the extra rowset

ALTER TRIGGER TR_PostalRegion_Insert
ON dbo.PostalRegion
INSTEAD OF INSERT
AS
SET NOCOUNT ON;
INSERT INTO dbo.PostalCode
SELECT CustomerID,
       SUBSTRING(PostalRegion,1,5),
       CASE WHEN SUBSTRING(PostalRegion,7,5) <> '' THEN SUBSTRING(PostalRegion,7,5) END
FROM inserted;
GO

-- Step 19: Try to insert into the view again

INSERT INTO dbo.PostalRegion (CustomerID,PostalRegion)
VALUES (5,'92232-142');
GO

-- Step 20: Note that only the correct rowcount is returned now

-- Step 21: Make sure the trigger works for multi-row inserts

INSERT INTO dbo.PostalRegion (CustomerID,PostalRegion)
VALUES (6,'11111-111'),
       (7,'99999-999');
GO

SELECT * FROM dbo.PostalRegion;
GO

-- Step 22: Drop the view and the table

DROP VIEW dbo.PostalRegion;
GO
DROP TABLE dbo.PostalCode;
GO

image

-- Demonstration 3B - Replacing Triggers with Computed Columns

-- Step 1: Open a new query window
--         and use the tempdb database

USE tempdb;
GO

-- Step 2: Create the dbo.SellingPrice table

CREATE TABLE dbo.SellingPrice
(
    SellingPriceID int IDENTITY(1,1)
      CONSTRAINT PK_SellingPrice PRIMARY KEY,
    SubTotal decimal(18,2) NOT NULL,
    TaxAmount decimal(18,2) NOT NULL,
    FreightAmount decimal(18,2) NOT NULL,
    ExtendedAmount decimal(18,2) NULL
);
GO

-- Step 3: Create a trigger to maintain the ExtendedAmount column

CREATE TRIGGER TR_SellingPrice_InsertUpdate
ON dbo.SellingPrice
AFTER INSERT, UPDATE AS BEGIN
  SET NOCOUNT ON;
  UPDATE sp
  SET sp.ExtendedAmount = sp.SubTotal
                        + sp.TaxAmount
                        + sp.FreightAmount
  FROM dbo.SellingPrice AS sp
  INNER JOIN inserted AS i
  ON sp.SellingPriceID = i.SellingPriceId;
END;
GO

-- Step 4: Test the trigger by inserting some rows
--         and selecting the values inserted

INSERT INTO dbo.SellingPrice
  (SubTotal, TaxAmount, FreightAmount)
  VALUES (12.3, 1.23, 10), (5, 1, 2);
GO

SELECT * FROM dbo.SellingPrice;
GO

-- Step 5: Drop the table and recreate it with a computed column

DROP TABLE dbo.SellingPrice;
GO

CREATE TABLE dbo.SellingPrice
(
    SellingPriceID int IDENTITY(1,1)
      CONSTRAINT PK_SellingPrice PRIMARY KEY,
    SubTotal decimal(18,2) NOT NULL,
    TaxAmount decimal(18,2) NOT NULL,
    FreightAmount decimal(18,2) NOT NULL,
    ExtendedAmount AS (SubTotal + TaxAmount + FreightAmount) PERSISTED
);
GO

-- Step 6: Reinsert the data to ensure the behavior is maintained
--         Note that it would now be more efficient

INSERT INTO dbo.SellingPrice
  (SubTotal, TaxAmount, FreightAmount)
  VALUES (12.3, 1.23, 10), (5, 1, 2);
GO

SELECT * FROM dbo.SellingPrice;
GO

-- Step 7: Drop the table

DROP TABLE dbo.SellingPrice;
GO

A computed column is preferable to a trigger.

Best Practices
1. In many business scenarios, it makes sense to mark records as deleted with a status column and use a
trigger or stored procedure to update an audit trail table. The changes can then be audited, the data
is not lost, and the IT staff can perform purges or archival of the deleted records.
2. Avoid using triggers in situations where constraints could be used instead.

( 70-464 )
-- Demonstration 2A - AFTER INSERT Triggers

-- Step 1: Open a new query window
--         and use the AdventureWorks database

USE AdventureWorks;
GO

-- Step 2: Create an INSERT trigger

CREATE TRIGGER TR_SalesOrderHeader_Insert
ON Sales.SalesOrderHeader
AFTER INSERT AS BEGIN
  IF EXISTS( SELECT 1
             FROM inserted AS i
             WHERE i.SubTotal > 10000
             AND i.PurchaseOrderNumber IS NULL
           ) BEGIN
    PRINT 'Orders above 10000 must have PO numbers';
    ROLLBACK;           
  END;
END;
GO

-- Step 3: Execute the following code to test the trigger
--         (first try to insert with a value more than 10000
--         but with a PO number)

INSERT INTO Sales.SalesOrderHeader
  (RevisionNumber, OrderDate, DueDate, Status,
   OnlineOrderFlag, PurchaseOrderNumber,
   CustomerID, ContactID, BillToAddressID, ShipToAddressID,
   ShipMethodID, SubTotal, TaxAmt, Freight)
  VALUES (1, SYSDATETIME(), SYSDATETIME(), 1,
          1, 'ABC-123',
          4, 1, 3, 3,
          1, 10502, 12, 100);
GO

-- Step 4: Try to insert with a value less than 10000
--         and without a PO number (will work)

INSERT INTO Sales.SalesOrderHeader
  (RevisionNumber, OrderDate, DueDate, Status,
   OnlineOrderFlag, PurchaseOrderNumber,
   CustomerID, ContactID, BillToAddressID, ShipToAddressID,
   ShipMethodID, SubTotal, TaxAmt, Freight)
  VALUES (1, SYSDATETIME(), SYSDATETIME(), 1,
          1, NULL,
          4, 1, 3, 3,
          1, 8000, 12, 100);
GO

-- Step 5: Try to insert without a purchase order number
--         but with a value more than 10000 (will fail)

INSERT INTO Sales.SalesOrderHeader
  (RevisionNumber, OrderDate, DueDate, Status,
   OnlineOrderFlag, PurchaseOrderNumber,
   CustomerID, ContactID, BillToAddressID, ShipToAddressID,
   ShipMethodID, SubTotal, TaxAmt, Freight)
  VALUES (1, SYSDATETIME(), SYSDATETIME(), 1,
          1, NULL,
          4, 1, 3, 3,
          1, 18000, 12, 100);
GO

-- Step 6: Drop the trigger and remove the rows added

DROP TRIGGER Sales.TR_SalesOrderHeader_Insert;
GO

DELETE soh
FROM Sales.SalesOrderHeader AS soh
WHERE NOT EXISTS (SELECT 1
                  FROM Sales.SalesOrderDetail AS sod
                  WHERE soh.SalesOrderID = sod.SalesOrderID);
GO

AFTER INSERT – putting business logic in the trigger.

-- Demonstration 2B - AFTER DELETE Triggers

-- Step 1: Open a new query window
--         and use the AdventureWorks database

USE AdventureWorks;
GO

-- Step 2: Show the contents of the Sales.SalesTerritoryHistory table
--         (highlighting the nullable EndDate column)

SELECT * FROM Sales.SalesTerritoryHistory;
GO

-- Step 3: Create a trigger to disallow deletion of active history rows

CREATE TRIGGER TR_SalesTerritoryHistory_Delete
ON Sales.SalesTerritoryHistory
AFTER DELETE AS BEGIN
  IF EXISTS(SELECT 1
            FROM deleted AS d
            WHERE d.EndDate IS NULL) BEGIN
    PRINT 'Current Sales Territory History rows cannot be deleted';
    ROLLBACK;
  END;
END;
GO

-- Step 4: Test the trigger by attempting to delete a row
--         (the delete will fail)

SELECT * FROM Sales.SalesTerritoryHistory WHERE SalesPersonID = 283;
GO

DELETE FROM Sales.SalesTerritoryHistory
WHERE SalesPersonID = 283;
GO

-- Step 5: Drop the trigger
DROP TRIGGER Sales.TR_SalesTerritoryHistory_Delete;
GO

AFTER DELETE trigger – business logic

AFTER UPDATE exists too.

( 70-464 )

image

DDL is Data Definition Language

Key Points
A DML trigger is a special kind of stored procedure that executes when an INSERT, UPDATE, or DELETE
statement modifies the data in a specified table or view. This includes any INSERT, UPDATE, or DELETE
statement that forms part of a MERGE statement. A trigger can query other tables and can include
complex Transact-SQL statements.


DDL triggers are similar to DML triggers but DDL triggers fire when DDL events occur. DDL events occur
for most CREATE, ALTER or DROP statements in the T-SQL language.
Logon triggers are a special form of trigger that fire when a new session is established. There is no
concept of a Logoff trigger at present.
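
A minimal DDL trigger sketch (the trigger name is hypothetical) that blocks table drops in the current database:

CREATE TRIGGER TR_BlockTableDrops
ON DATABASE
FOR DROP_TABLE
AS BEGIN
  PRINT 'Tables cannot be dropped in this database. Disable this trigger first.';
  ROLLBACK;
END;
GO
-- DROP TRIGGER TR_BlockTableDrops ON DATABASE;   -- remove it when finished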

 

image

image

image

image

( 70-464 )

image

image

-- Demonstration 3A - Deadlock retry - first Window

-- Step 1: Start SQL Profiler

-- Step 2: Create and start a new trace with only the Deadlock Graph event

-- Step 3: Execute the deadlock retry code in this window
--         and then immediately execute the code in the second window

-- Step 4: Note that both windows complete without apparent error

-- Step 5: Stop the profiler trace and view the graph

USE Deadlock;
GO

DECLARE @RetryCount int = 5;

WHILE (@RetryCount > 0) -- retry update if deadlock victim
BEGIN
  BEGIN TRY
    BEGIN TRANSACTION;
      UPDATE dbo.Product SET ProductName = 'Prod 1 Modified 2'
        WHERE ProductID = 1;

      WAITFOR DELAY '00:00:10';

      UPDATE dbo.Product SET ProductName = 'Prod 2 Modified 2'
        WHERE ProductID = 2;
      SET @RetryCount = 0;
    COMMIT TRANSACTION;
  END TRY
  BEGIN CATCH
    IF (ERROR_NUMBER() = 1205) -- deadlock victim
       BEGIN
         SET @RetryCount = @RetryCount - 1;
         WAITFOR DELAY '00:00:02';
         PRINT 'Warning: Deadlock occurred';
       END
    ELSE
       SET @RetryCount = -1;
    IF XACT_STATE() <> 0
      ROLLBACK TRANSACTION;
  END CATCH;
END;

USE master;
GO

Using deadlock retrying

Best Practices
When designing client-side database access code, do not assume that database operations will always
occur without error. Instead of a pattern like:
• Start a transaction
• Do some work
• Commit the transaction


Consider instead a pattern like:
• Reset the retry count.
• While the transaction is not committed and the retry count is not exhausted, attempt to perform the
work and commit the transaction.
• If an error occurs and it is an error that retries could apply to, retry. Otherwise, return the error to the
calling code.

( 70-464 )

image

BEGIN TRAN;
DELETE Production.Product WHERE ProductID = 1;
PRINT 'Hello';
COMMIT;
PRINT 'Hello again';

image

SET XACT_ABORT ON;
BEGIN TRAN;
DELETE Production.Product WHERE ProductID = 1;
PRINT 'Hello';
COMMIT;
PRINT 'Hello again';

GO
PRINT 'Dave'

image

SET XACT_ABORT ON
The SET XACT_ABORT ON statement is used to tell SQL Server that statement terminating errors should
become batch terminating errors.

image

image

image

-- Demonstration 2A (SQL Server Error Handling)

-- Step 1: Open a new query window to tempdb

USE tempdb;
GO

-- Step 2: Use RAISERROR to raise an error of severity 10.
--         Note that in the results pane it is not displayed
--         as an error

DECLARE @DatabaseID int = DB_ID();
DECLARE @DatabaseName sysname = DB_NAME();

RAISERROR
    (N'The current database ID is:%d, the database name is: %s.',
    10, -- Severity.
    1, -- State.
    @DatabaseID, -- First substitution argument.
    @DatabaseName); -- Second substitution argument.
GO

-- Step 3: See the difference when the severity is 16. It appears
--         as an error.

DECLARE @DatabaseID int = DB_ID();
DECLARE @DatabaseName sysname = DB_NAME();

RAISERROR
    (N'The current database ID is:%d, the database name is: %s.',
    16, -- Severity.
    1, -- State.
    @DatabaseID, -- First substitution argument.
    @DatabaseName); -- Second substitution argument.
GO

-- Step 4: Add a new custom error message

EXECUTE sp_addmessage 61485, 16,
        'Houston, we have a problem in DatabaseID: %d, Database Name: %s';
GO

-- Step 5: Raise the new custom error message

DECLARE @DatabaseID int = DB_ID();
DECLARE @DatabaseName sysname = DB_NAME();

RAISERROR (61485,16,1,@DatabaseID,@DatabaseName);
GO

-- Step 6: Open the Demonstration 2A - second window
--         and try to add the same error number
--USE tempdb;
--GO

--EXECUTE sp_addmessage 61485, 16,
--        'Current DatabaseID: %d, Database Name: %s';
--GO
-- Step 7: Return here after executing in the second window

-- Step 8: Execute the following code using @@ERROR
--         Note how the @@ERROR value is quickly cleared

RAISERROR(N'Message', 16, 1);
IF @@ERROR <> 0
  PRINT 'Error=' + CAST(@@ERROR AS VARCHAR(8));
GO

-- Step 9: Execute code to capture the @@ERROR value

DECLARE @ErrorValue int;
RAISERROR(N'Message', 16, 1);
SET @ErrorValue = @@ERROR;
IF @ErrorValue <> 0
  PRINT 'Error=' + CAST(@ErrorValue AS VARCHAR(8));
GO

-- Step 10: Try to raise a system error

RAISERROR (823, 19, 1);
GO

-- Step 11: Use THROW to raise an error

THROW 51245, 'Not Happy Now', 1;
GO

Various ways of handling errors (a TRY…CATCH sketch follows)
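
The demo uses RAISERROR, @@ERROR, and THROW; a minimal TRY…CATCH sketch combining them with the error functions:

BEGIN TRY
  RAISERROR(N'Something went wrong', 16, 1);
END TRY
BEGIN CATCH
  SELECT ERROR_NUMBER()   AS ErrorNumber,
         ERROR_SEVERITY() AS ErrorSeverity,
         ERROR_MESSAGE()  AS ErrorMessage;
  -- THROW;   -- re-raise the original error if callers should see it
END CATCH;
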

( 70-464 )
  • Syntax errors, e.g. SELECT TOP(10) FROM Production.Product… nothing is selected
  • Object resolution errors, e.g. SELECT * FROM NonExistantTable
  • Statement termination errors
    • DELETE FROM Products WHERE ProductID = 1 (fails, e.g. on a foreign key violation)
    • PRINT ‘Hello’ still runs afterwards

image

image

( 70-464 )

image

Best Practices
1. Always use the lowest transaction isolation level possible to avoid blocking and to avoid the chance
of deadlocks.
2. Many Microsoft-supplied components default to Serializable transactional isolation level but do not
need to be run at that level. Common examples are Component Services and BizTalk adapters.
3. Before spending too much time investigating blocking issues, make sure that all the queries that are
involved are executing quickly. This usually involves making sure that appropriate indexes are in
place. Often when query performance issues are resolved, blocking issues disappear.

( 70-464 )

image

Key Points
Applications might need to wait some time for locks held by other applications to be released. A key
decision is how long an application should wait for a lock to be released.


Locking Timeout
The length of time that it is reasonable to wait for a lock to be released is totally dependent upon the
design requirements of the application. By default, SQL Server will wait forever for a lock.
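
For example, a session can cap that wait with SET LOCK_TIMEOUT (Production.Product is just an example table):

SET LOCK_TIMEOUT 5000;              -- wait at most 5 seconds for any lock

SELECT * FROM Production.Product;   -- fails with error 1222 if a lock cannot be obtained in time

SET LOCK_TIMEOUT -1;                -- back to the default: wait forever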

image

image

( 70-464 )

image

  • Atomic – all steps must succeed or none
  • Consistent – DB has to be in a consistent state after each step, not just after whole transaction
  • Isolation – must be isolated from other transactions
  • Durable – changes must be permanent

Transaction log – db writes to log on disk first, then db pages later

image

-- Demonstration 1A

-- Step 1 - Open a query window to the tempdb database

USE tempdb;
GO

DROP TABLE dbo.TestTable;   -- clean up any copy left over from a previous run
-- Step 2 - Create and populate a test table

CREATE TABLE dbo.TestTable
( TestTableID int IDENTITY(1,1) PRIMARY KEY,
  TestName nvarchar(20)
);
GO

INSERT INTO dbo.TestTable
  (TestName)
  VALUES ('Row 1'),('Row 2'),('Row 3'),('Row 4');
GO

SELECT * FROM dbo.TestTable;
GO

-- Step 3 - Update the test table

UPDATE dbo.TestTable
SET TestName = 'Update 1'
WHERE TestTableID = 1;
GO

-- Step 4 - Ask the students what they would expect to see if you now tried to rollback the transaction

-- Step 5 - Then attempt to roll it back
--          (An error occurs as there is no transaction current)

ROLLBACK;
GO

-- Step 6 - Requery the table and show that the update was not rolled back

SELECT * FROM dbo.TestTable;
GO

-- Step 7 - Now try the same in a transaction

BEGIN TRANSACTION;

UPDATE dbo.TestTable
SET TestName = 'Update 2'
WHERE TestTableID = 2;

UPDATE dbo.TestTable
SET TestName = 'Update 3'
WHERE TestTableID = 3;
GO

-- Step 8 - Ask the students what they would expect to see if they queried the table from within this transaction

-- Step 9 - Select from the table to show them the result

SELECT * FROM dbo.TestTable;
GO

-- Step 10 - Ask the students what they would expect to see at this point in another query window that selected from the table

-- Step 11 - Switch to a second query window


-- Step 14 - Then back in the first window, roll the transaction back

ROLLBACK;
GO

-- Step 15 - Note that the second query window has completed
-- and the data in the same state it was prior to the transaction starting

-- Step 16 - Clean up

DROP TABLE dbo.TestTable;
GO

Inside the transaction the update will appear to have worked, until the rollback.

SET XACT_ABORT – in an autocommit transaction, use this across multiple statements to abort everything if any statement errors

image

image

image

( 70-464 )
  • Inline
  • Multi-statement

image

-- Demonstration 3A

-- Step 1 - Open a new query window against the AdventureWorks database

USE AdventureWorks;
GO

-- Step 2 - Use the code from the earlier slide to create a table-valued function

CREATE FUNCTION Sales.GetLastOrdersForCustomer
(@CustomerID int, @NumberOfOrders int)
RETURNS TABLE
AS
RETURN (SELECT TOP(@NumberOfOrders)
                              soh.SalesOrderID,
                              soh.OrderDate,
                              soh.PurchaseOrderNumber
                FROM Sales.SalesOrderHeader AS soh
                WHERE soh.CustomerID = @CustomerID
                ORDER BY soh.OrderDate DESC
               );
GO

-- Step 3 - Query that function. It will return the last two
--          orders for customer 17288.

SELECT * FROM Sales.GetLastOrdersForCustomer(17288,2);
GO

-- Step 4 - Now show how CROSS APPLY could be used to call this
--          function (note that many students will not be familiar
--          with CROSS APPLY so you might wish to review its use,
--          particularly in relation to table-valued functions
--          Note as a matter of interest that if you scroll to
--          customer 11012, you will see a customer with less than
--          three orders. The function will still return these customers.

SELECT c.CustomerID,
             c.AccountNumber,
             glofc.SalesOrderID,
             glofc.OrderDate
FROM Sales.Customer AS c
CROSS APPLY Sales.GetLastOrdersForCustomer(c.CustomerID,3) AS glofc
ORDER BY c.CustomerID,glofc.SalesOrderID;

-- Step 5 - Drop the function

DROP FUNCTION Sales.GetLastOrdersForCustomer;
GO

Key Points

The code for views is incorporated directly into the code for the query that accesses the view. This is not
the case for scalar functions.

 

Common Performance Problems

The over-use of scalar functions is a common cause of performance problems in SQL Server systems. For
example, a WHERE clause predicate that calls a scalar function calls that function for every target row.
In many cases, extracting the code from the function definition and incorporating it directly into the
query will resolve the performance issue. You will see an example of this in the next lab.

image

image

( 70-464 )

image

Must have a BEGIN and END block (unlike a stored procedure, where BEGIN/END is optional)

-- Demonstration 2A

-- Step 1 - Open a new query window against the tempdb database.

USE tempdb;
GO

-- Step 2 - Create a function
--          Note that SQL Server 2012 now includes a new function
--          for calculating the end of the current month (EOMONTH)

CREATE FUNCTION dbo.EndOfPreviousMonth (@DateToTest date)
RETURNS date
AS BEGIN
  RETURN DATEADD(day, 0 - DAY(@DateToTest), @DateToTest);
END;
GO

-- Step 3 - Query the function. The first query will return
--          the date of the end of last month. The second
--          query will return the date of the end of the
--          year 2009.

SELECT dbo.EndOfPreviousMonth(SYSDATETIME());
SELECT dbo.EndOfPreviousMonth('2010-01-01');
GO

-- Step 4 - Determine if the function is deterministic. The function
--          is not deterministic.

SELECT OBJECTPROPERTY(OBJECT_ID('dbo.EndOfPreviousMonth'),'IsDeterministic');
GO

-- Step 5 Question for students: SQL Server now includes
--        an EOMONTH function. How could you modify the function
--        above to use that new function?
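-- (One possible answer, assuming SQL Server 2012 or later: replace the function body with
--     RETURN EOMONTH(@DateToTest, -1);
--  EOMONTH with an offset of -1 returns the last day of the previous month.)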

-- Step 6 - Drop the function

DROP FUNCTION dbo.EndOfPreviousMonth;
GO

image
Function just returns the last day of the previous month

( 70-464 )

Functions are routines that are used to encapsulate frequently performed logic. Rather than having to repeat all the function logic, any code that must perform the logic can call the function.

image

  • Scalar functions – return a single value of a specified data type
  • Inline table-valued functions – return a table defined by a single SELECT statement
  • Multi-statement table-valued functions – similar to stored procedures in that the body can contain multiple statements; they build up and return a table
  • System functions – built in and cannot be modified
( 70-464 )
-- Demonstration 3A

-- Step 1 - Open a new query window to the tempdb database

USE tempdb;
GO

-- Step 2 - Create a Sales and SalesDetails table

CREATE TABLE dbo.Sales
( SaleID int IDENTITY PRIMARY KEY,
  CustomerID int,
  PurchaseOrderNumber varchar(20),
  SoldWhen datetime2 DEFAULT SYSDATETIME()
);

CREATE TABLE dbo.SalesDetails
( SalesDetailID int IDENTITY,
  SaleID int REFERENCES dbo.Sales(SaleID),
  Description varchar(50),
  Price decimal(18,2)
);
GO

-- Step 3 - Create traditional insert stored procedures for both tables

CREATE PROCEDURE dbo.SalesInsert
  @CustomerID int,
  @PurchaseOrderNumber varchar(20),
  @SaleID int OUTPUT
AS BEGIN
  INSERT INTO dbo.Sales (CustomerID,PurchaseOrderNumber)
    VALUES(@CustomerID,@PurchaseOrderNumber);
  SELECT @SaleID = SCOPE_IDENTITY();
END;
GO

CREATE PROCEDURE dbo.SalesDetailInsert
  @SaleID int,
  @Description varchar(50),
  @Price decimal(18,2),
  @SalesDetailID int OUTPUT
AS BEGIN
  INSERT INTO dbo.SalesDetails (SaleID,Description,Price)
    VALUES(@SaleID,@Description,@Price);
  SELECT @SalesDetailID = SCOPE_IDENTITY();
END;
GO

-- Step 4 - Show how we would have previously inserted an order

DECLARE @SaleID int;
DECLARE @SalesDetailID int;

BEGIN TRAN;

EXEC dbo.SalesInsert 12,'2176A',@SaleID OUTPUT;

EXEC dbo.SalesDetailInsert @SaleID,'Product 1',12.3,@SalesDetailID OUTPUT
EXEC dbo.SalesDetailInsert @SaleID,'Product 2',14.6,@SalesDetailID OUTPUT
EXEC dbo.SalesDetailInsert @SaleID,'Product 3',122.35,@SalesDetailID OUTPUT

COMMIT;
GO

SELECT * FROM dbo.Sales;
SELECT * FROM dbo.SalesDetails;
GO

-- Step 5 - Create a table data type to hold the sales details

CREATE TYPE dbo.SalesDetails AS TABLE
( Description varchar(50),
  Price decimal(18,2)
);
GO

-- Step 6 - Modify the insert procedure to take detail lines as well

ALTER PROCEDURE dbo.SalesInsert
  @CustomerID int,
  @PurchaseOrderNumber varchar(20),
  @SalesDetails dbo.SalesDetails READONLY,
  @SaleID int OUTPUT
AS BEGIN
  BEGIN TRAN;
    INSERT INTO dbo.Sales (CustomerID,PurchaseOrderNumber)
       VALUES(@CustomerID,@PurchaseOrderNumber);
    SELECT @SaleID = SCOPE_IDENTITY();
    INSERT INTO dbo.SalesDetails (SaleID,Description,Price)
      SELECT @SaleID, Description,Price
      FROM @SalesDetails;
  COMMIT;
END;
GO

-- Step 7 - Perform an insert with a single round-trip

DECLARE @SaleID int;
DECLARE @SalesDetails dbo.SalesDetails;

INSERT INTO @SalesDetails VALUES('Product 1',12.3),('Product 2',14.66),('Product 3',122.35);
EXEC dbo.SalesInsert 12,'2176A',@SalesDetails,@SaleID OUTPUT;
GO

SELECT * FROM dbo.Sales;
SELECT * FROM dbo.SalesDetails;
GO

-- Step 8 - Query the sys.types and sys.table_types system views
--          and note the entries for SalesDetails

SELECT * FROM sys.types;
SELECT * FROM sys.table_types;
GO

Using TABLE type to pass multiple rows of the SalesDetails

Review Questions

1. What is the difference between SOURCE NOT MATCHED and TARGET NOT MATCHED in a MERGE statement?
2. What is a key advantage of the MERGE statement in terms of performance?

Best Practices

1. Use multi-row inserts when the rows being inserted are related in some way, for example, the detail rows of an invoice.
2. Consider making multiple-entity procedures instead of single-entity procedures to help minimize round trip behaviour and to reduce locking. For example, very minor changes are required to construct a stored procedure that can insert multiple sales orders compared to a stored procedure that can insert a single sales order.

( 70-464 )

SQL Server 2000 introduced the table data type for use as variables. SQL Server 2008 added user-defined table types, which can be created as permanent (or temporary) objects and passed to procedures and functions as read-only table-valued parameters.

image

image

-- Demonstration 2A

-- Step 1 - Open a new query window to the MarketDev database on the Marketing server

USE MarketDev;
GO

-- Step 2 - Select from the dbo.StringListToTable function

DECLARE @CustomerList nvarchar(200);

SET @CustomerList = '12,15,99,214,228,917';

SELECT * FROM dbo.StringListToTable(@CustomerList,',');
GO

-- Step 3 - Try a different delimiter

DECLARE @CustomerList nvarchar(200);

SET @CustomerList = '12|15|99|214|228|917';

SELECT * FROM dbo.StringListToTable(@CustomerList,'|');
GO

-- Step 4 - Use the function in a join
--          Even though it "works", what is the issue?


DECLARE @CustomerList nvarchar(200);

SET @CustomerList = '12,15,99,214,228,917';

SELECT *
FROM dbo.StringListToTable(@CustomerList,',') AS sl
INNER JOIN Marketing.Prospect AS p
ON sl.StringValue = p.ProspectID
ORDER BY p.ProspectID;
GO

-- Step 5 - Execute a query with an incorrectly formed list
--          to discover the issue

DECLARE @CustomerList nvarchar(200);

SET @CustomerList = '12,15,99A,214,228,917';

SELECT *
FROM dbo.StringListToTable(@CustomerList,',') AS sl
INNER JOIN Marketing.Prospect AS p
ON sl.StringValue = p.ProspectID
ORDER BY p.ProspectID;
GO

Using a table-valued function to split the list – but it is brittle: a malformed list causes a conversion error in the join.

-- Demonstration 2B

-- Step 1 - Open a new query window to the tempdb database

USE tempdb;
GO

-- Step 2 - Many users have not worked with row constructors
--          so show a few common uses - start by creating a test table


CREATE TABLE dbo.TestTable
( TestTableID int IDENTITY,
  ValueA int,
  ValueB int
);
GO

-- Step 3 - Perform a multi-row insert

INSERT INTO TestTable (ValueA,ValueB)
  VALUES(1, 2), (3, 4), (5, 6);
GO

-- Step 4 - Show that sub-selects are now also
--          permitted in the values clause in INSERT statements

INSERT INTO TestTable (ValueA,ValueB)
  VALUES
   ((SELECT MIN(ValueA) FROM TestTable),(SELECT MAX(ValueA) FROM TestTable)),
   ((SELECT MIN(ValueB) FROM TestTable), 4);
GO

SELECT * FROM dbo.TestTable;
GO

-- Step 5 - Show that a VALUES clause can be used without
--          an INSERT statement to construct a table on
--          the fly. Mention that we will use this ability
--          in the last lesson in this module. Note also that
--          this example also uses the alternate method for
--          assigning column aliases (within the table alias
--          rather than in an AS clause after each column)

SELECT * FROM (VALUES(1), (2), (3), (4)) AS SomeTable(SomeColumn);
GO

-- Step 6 - Create the CustomerBalance table type

CREATE TYPE dbo.CustomerBalance
AS TABLE ( CustomerID int,
           CurrentBalance decimal(18,2));
GO

-- Step 7 - Declare a variable of type CustomerBalance and
--          populate it with 3 rows

DECLARE @Balance dbo.CustomerBalance;

INSERT INTO @Balance
  VALUES (12,14.50),(15,13.75),(22,19.50);

SELECT * FROM @Balance;
GO

-- Step 8 - Do the same operation with the SELECT statement
--          in a separate batch - note that like all variables
--          table variables are batch scoped and an error occurs

DECLARE @Balance dbo.CustomerBalance;

INSERT INTO @Balance
  VALUES (12,14.50),(15,13.75),(22,19.50);

GO
SELECT * FROM @Balance;
GO

TABLE types and Row Constructors

# Tuesday, 17 September 2013
( 70-464 )

A very common requirement when coding in T-SQL is the need to update a row if it exists, but to insert the row if it does not already exist. The MERGE statement, introduced in SQL Server 2008, addresses this.

It can operate on entire sets of data rather than just on single rows and can perform alternate actions such as deletes

Some other RDBMSs have an UPSERT statement; MERGE is more capable.

MERGE is atomic – all or nothing occurs.

Target – the table being modified; it is specified first.

Source – provides the rows; it is specified in the USING clause and can be a table, view, sub-select, CTE, or VALUES clause.

Matched ON – the ON clause defines how source rows are matched to target rows.

-- Step 1 - Open a new query window to tempdb

USE tempdb;
GO

-- Step 2 - Create a master table and populate some rows

CREATE TABLE dbo.Employee
( EmployeeID int PRIMARY KEY,
  FullName nvarchar(40),
  EmploymentStatus int
);
GO

INSERT INTO dbo.Employee  (EmployeeID, FullName, EmploymentStatus)
  VALUES (1, 'David Hamilton',1);
INSERT INTO dbo.Employee  (EmployeeID, FullName, EmploymentStatus)
  VALUES (2, 'Nupur Argawal',1);
INSERT INTO dbo.Employee  (EmployeeID, FullName, EmploymentStatus)
  VALUES (3, 'Guido Pica',2);
INSERT INTO dbo.Employee  (EmployeeID, FullName, EmploymentStatus)
  VALUES (4, 'Kim Truelsen',1);
GO

-- Step 3 - Create a table of updates and populate some rows
--          Note that we are updating David Hamilton's employment
--          status and Kim Truelsen is now Kim Akers. Also, we
--          have a new employee

CREATE TABLE dbo.EmployeeUpdate
( EmployeeUpdateID int IDENTITY(1,1) PRIMARY KEY,
  EmployeeID int NOT NULL,
  FullName nvarchar(40) NULL,
  EmploymentStatus int NULL
);
GO

INSERT INTO dbo.EmployeeUpdate (EmployeeID,FullName,EmploymentStatus)
VALUES (1, 'David Hamilton',2);
INSERT INTO dbo.EmployeeUpdate (EmployeeID,FullName,EmploymentStatus)
VALUES (4, 'Kim Akers',1);
INSERT INTO dbo.EmployeeUpdate (EmployeeID,FullName,EmploymentStatus)
VALUES (5, 'Phyllis Harris',1);
GO

-- Step 4 - Perform the merge and note the use of the OUTPUT clause

MERGE INTO dbo.Employee AS e
USING dbo.EmployeeUpdate AS eu
ON e.EmployeeID = eu.EmployeeID
WHEN MATCHED THEN
    UPDATE SET e.FullName = eu.FullName,
               e.EmploymentStatus = eu.EmploymentStatus
WHEN NOT MATCHED THEN
    INSERT (EmployeeID, FullName, EmploymentStatus)
        VALUES (eu.EmployeeID, eu.FullName, eu.EmploymentStatus)
OUTPUT $action, inserted.EmployeeID, deleted.EmployeeID;
GO

image
A simple MERGE matched on an ID – very ETL-like, with the use of an update (staging) table.


-- Step 7 - Change the MERGE statement to only update columns
--          where a value has been provided

MERGE INTO dbo.Employee AS e
USING dbo.EmployeeUpdate AS eu
ON e.EmployeeID = eu.EmployeeID
WHEN MATCHED THEN
    UPDATE SET e.FullName = COALESCE(eu.FullName,e.FullName),
               e.EmploymentStatus = COALESCE(eu.EmploymentStatus,e.EmploymentStatus)
WHEN NOT MATCHED THEN
    INSERT (EmployeeID, FullName, EmploymentStatus)
        VALUES (eu.EmployeeID, eu.FullName, eu.EmploymentStatus)
OUTPUT $action, inserted.EmployeeID, deleted.EmployeeID;
GO
SELECT * FROM dbo.Employee;
GO

So if FullName comes in with a NULL value it will not overwrite the existing value, and the same applies to EmploymentStatus.

The actions performed by a MERGE statement are not identical to those that would be performed by
separate INSERT, UPDATE or DELETE statements.

Performance of MERGE

The MERGE statement will often outperform code constructed from separate INSERT, UPDATE and
DELETE statements and conditional logic. In particular, the MERGE statement only ever makes a single
pass through the data in the target table.
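A hedged sketch of the extra branch MERGE offers (assumes the dbo.Employee and dbo.EmployeeUpdate tables from the demonstration above still exist); replicating this with separate statements would require an UPDATE, an INSERT and a DELETE plus conditional logic:

MERGE INTO dbo.Employee AS e
USING dbo.EmployeeUpdate AS eu
ON e.EmployeeID = eu.EmployeeID
WHEN MATCHED THEN                    -- in both source and target: update
    UPDATE SET e.FullName = eu.FullName,
               e.EmploymentStatus = eu.EmploymentStatus
WHEN NOT MATCHED BY TARGET THEN      -- in the source but not the target: insert
    INSERT (EmployeeID, FullName, EmploymentStatus)
        VALUES (eu.EmployeeID, eu.FullName, eu.EmploymentStatus)
WHEN NOT MATCHED BY SOURCE THEN      -- in the target but not the source: delete
    DELETE
OUTPUT $action, inserted.EmployeeID, deleted.EmployeeID;
GO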

( 70-464 )

The security context that a stored procedure executes in is referred to as its execution context. This
context is used to establish the identity against which permissions to execute statements or perform
actions are checked.

image

-- Demonstration 4A

-- Step 1: Open a new query window to the tempdb database

USE tempdb;
GO

-- Step 2: Create a stored procedure that queries sys.login_token
--         and sys.user_token

CREATE PROC dbo.DisplayExecutionContext
AS
  SELECT * FROM sys.login_token;
  SELECT * FROM sys.user_token;
GO

-- Step 3: Execute the stored procedure and review the rowsets returned

EXEC dbo.DisplayExecutionContext;
GO

-- Step 4: Use the EXECUTE AS statement to change context

EXECUTE AS User = 'SecureUser';
GO

-- Step 5: Try to execute the procedure. Why doesn't it work?

EXEC dbo.DisplayExecutionContext;
GO

-- Step 6: Revert to the previous security context

REVERT;
GO

-- Step 7: Grant permission to SecureUser to execute the procedure

GRANT EXECUTE ON dbo.DisplayExecutionContext TO SecureUser;
GO

-- Step 8: Now try again and note the output

EXECUTE AS User = 'SecureUser';
GO

EXEC dbo.DisplayExecutionContext;
GO

REVERT;
GO

-- Step 9: Alter the procedure to execute as owner

ALTER PROC dbo.DisplayExecutionContext
WITH EXECUTE AS OWNER
AS
  SELECT * FROM sys.login_token;
  SELECT * FROM sys.user_token;
GO

-- Step 10: Execute as SecureUser again and note the difference

EXECUTE AS User = 'SecureUser';
GO

EXEC dbo.DisplayExecutionContext;
GO

REVERT;
GO

-- Step 11: Drop the procedure

DROP PROC dbo.DisplayExecutionContext;
GO
( 70-464 )

image

-- Demonstration 3A

-- Step 1: Open a new query window to the AdventureWorks database

USE AdventureWorks;
GO

-- Step 2: Drop the Production.GetBlueProducts and
--         Production.GetBlueProductsAndModels stored procedures

DROP PROC Production.GetBlueProducts;
DROP PROC Production.GetBlueProductsAndModels;
GO

-- Step 3: Replace the Production.GetBlueProductsAndModels with
--         a new stored procedure that takes a color parameter
--         Note that we can't use ALTER as we really also need to
--         change the name of the procedure

CREATE PROC Production.GetProductsAndModelsByColor
@Color nvarchar(15)
AS
BEGIN
  SELECT p.ProductID,
         p.Name,
         p.Size,
         p.ListPrice
  FROM Production.Product AS p
  WHERE p.Color = @Color
  ORDER BY p.ProductID;
  
  SELECT p.ProductID,
         pm.ProductModelID,
         pm.Name AS ModelName
  FROM Production.Product AS p
  INNER JOIN Production.ProductModel AS pm
  ON p.ProductModelID = pm.ProductModelID
  WHERE p.Color = @Color
  ORDER BY p.ProductID, pm.ProductModelID;
END;
GO

-- Step 4: Execute the new procedure

EXEC Production.GetProductsAndModelsByColor 'Red';
GO

-- Step 5: Now another bug report has come in saying
--         that the procedure works fine except for
--         products that don't have a color. We need
--         to test that situation

EXEC Production.GetProductsAndModelsByColor NULL;
GO

-- Step 6: We notice that no rows come back. Many products
--         do not have a color. Ask students how we
--         could fix the problem.

-- Step 7: The issue is that we can't equate NULL values
--         as they need IS NULL instead. So let's fix the
--         procedure.

ALTER PROC Production.GetProductsAndModelsByColor
@Color nvarchar(15)
AS
BEGIN
  SELECT p.ProductID,
         p.Name,
         p.Size,
         p.ListPrice
  FROM Production.Product AS p
  WHERE (p.Color = @Color) OR (p.Color IS NULL AND @Color IS NULL)
  ORDER BY p.ProductID;
  
  SELECT p.ProductID,
         pm.ProductModelID,
         pm.Name AS ModelName
  FROM Production.Product AS p
  INNER JOIN Production.ProductModel AS pm
  ON p.ProductModelID = pm.ProductModelID
  WHERE (p.Color = @Color) OR (p.Color IS NULL AND @Color IS NULL)
  ORDER BY p.ProductID, pm.ProductModelID;
END;
GO

-- Step 8: Now test the procedure again

EXEC Production.GetProductsAndModelsByColor 'Red';
GO
EXEC Production.GetProductsAndModelsByColor NULL;
GO

-- Step 9: Drop the stored procedure

DROP PROC Production.GetProductsAndModelsByColor;
GO
( 70-464 )

In a stored procedure body, BEGIN and END are optional, but using them is considered good practice.

-- Demonstration 2A

-- Step 1: Open a new query window to the AdventureWorks database

USE AdventureWorks;
GO

-- Step 2: Create the GetBlueProducts stored procedure

CREATE PROC Production.GetBlueProducts
AS
BEGIN
  SELECT p.ProductID,
         p.Name,
         p.Size,
         p.ListPrice
  FROM Production.Product AS p
  WHERE p.Color = N'Blue'
  ORDER BY p.ProductID;
END;
GO

-- Step 3: Execute the stored procedure

EXEC Production.GetBlueProducts;
GO

-- Step 4: Create the GetBlueProductsAndModels stored procedure

CREATE PROC Production.GetBlueProductsAndModels
AS
BEGIN
  SELECT p.ProductID,
         p.Name,
         p.Size,
         p.ListPrice
  FROM Production.Product AS p
  WHERE p.Color = N'Blue'
  ORDER BY p.ProductID;
  
  SELECT p.ProductID,
         pm.ProductModelID,
         pm.Name AS ModelName
  FROM Production.Product AS p
  INNER JOIN Production.ProductModel AS pm
  ON p.ProductModelID = pm.ProductModelID
  ORDER BY p.ProductID, pm.ProductModelID;
END;
GO

-- Step 5: Execute the GetBlueProductsAndModels stored procedure
--         Note in particular that multiple rowsets can be
--         returned from a single stored procedure execution

EXEC Production.GetBlueProductsAndModels;
GO

-- Step 6: Now tell the students that a bug has been
--         reported in the GetBlueProductsAndModels
--         stored procedure. See if they can find the
--         problem

-- Step 7: The problem is that the 2nd query doesn't also
--         check that the product is blue so let's alter
--         the stored procedure to fix this

ALTER PROC Production.GetBlueProductsAndModels
AS
BEGIN
  SELECT p.ProductID,
         p.Name,
         p.Size,
         p.ListPrice
  FROM Production.Product AS p
  WHERE p.Color = N'Blue'
  ORDER BY p.ProductID;
  
  SELECT p.ProductID,
         pm.ProductModelID,
         pm.Name AS ModelName
  FROM Production.Product AS p
  INNER JOIN Production.ProductModel AS pm
  ON p.ProductModelID = pm.ProductModelID
  WHERE p.Color = N'Blue'
  ORDER BY p.ProductID, pm.ProductModelID;
END;
GO

-- Step 8: And re-execute the GetBlueProductsAndModels stored procedure

EXEC Production.GetBlueProductsAndModels;
GO

-- Step 9: Query sys.procedures to see the list of procedures

SELECT SCHEMA_NAME(schema_id) AS SchemaName,
       name AS ProcedureName
FROM sys.procedures;
GO

( 70-464 )

Views can reference other views; up to 32 levels of nesting are allowed.

CREATE VIEW HumanResources.EmployeeList
(EmployeeID, FamilyName, GivenName)
AS
SELECT EmployeeID, LastName, FirstName
FROM HumanResources.Employee;

ORDER BY is only permitted in a view definition to support a TOP clause; it does not guarantee the order of rows returned when the view is queried.

UPDATING

Updates performed through a view cannot affect columns from more than one base table, although this can be worked around with INSTEAD OF triggers.

It is possible to update a row through a view so that it no longer satisfies the view's WHERE clause and "disappears" from the view. Use WITH CHECK OPTION to prevent this.
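A minimal sketch of WITH CHECK OPTION (assumes a throwaway dbo.Ticket table in tempdb):

USE tempdb;
GO
CREATE TABLE dbo.Ticket (TicketID int PRIMARY KEY, Status varchar(10) NOT NULL);
GO
INSERT INTO dbo.Ticket VALUES (1, 'Open'), (2, 'Closed');
GO

CREATE VIEW dbo.OpenTickets
AS
SELECT TicketID, Status
FROM dbo.Ticket
WHERE Status = 'Open'
WITH CHECK OPTION;
GO

-- This update would push the row outside the view's WHERE clause,
-- so WITH CHECK OPTION rejects it with an error
UPDATE dbo.OpenTickets SET Status = 'Closed' WHERE TicketID = 1;
GO

DROP VIEW dbo.OpenTickets;
DROP TABLE dbo.Ticket;
GO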

WITH ENCRYPTION – obfuscates the view definition.  Not recommended.  Encryption is not very strong.

-- Demonstration 2A

-- Step 1 - Open a new query window to the AdventureWorks database

USE AdventureWorks;
GO

-- Step 2 - Create a new view

CREATE VIEW Person.IndividualsWithEmail
AS
SELECT ContactID, Title, FirstName, MiddleName, LastName
FROM Person.Contact AS c
WHERE c.EmailAddress IS NOT NULL;
GO

-- Step 3 - Query the view

SELECT * FROM Person.IndividualsWithEmail;
GO

-- Step 4 - Again query the view and order the results

SELECT *
FROM Person.IndividualsWithEmail
ORDER BY Title, ContactID;
GO

-- Step 5 - Query the view definition via OBJECT_DEFINITION

SELECT OBJECT_DEFINITION(OBJECT_ID(N'Person.IndividualsWithEmail',N'V'));
GO

-- Step 6 - Alter the view to use WITH ENCRYPTION

ALTER VIEW Person.IndividualsWithEmail
WITH ENCRYPTION
AS
SELECT ContactID, Title, FirstName, MiddleName, LastName
FROM Person.Contact AS c
WHERE c.EmailAddress IS NOT NULL;

-- Step 7 - Requery the view definition via OBJECT_DEFINITION

SELECT OBJECT_DEFINITION(OBJECT_ID(N'Person.IndividualsWithEmail',N'V'));
GO

-- Step 8 - Drop the view

DROP VIEW Person.IndividualsWithEmail;
GO

-- Step 9 - Script the existing HumanResources.vEmployee view
--          to a new query window and review its definition.
--          (In Object Explorer, click Connect, click Database Engine,
--           enter AdventureWorks as the Server Name and click Connect.
--           Expand Databases, expand AdventureWorks, expand Views. Right-click
--           HumanResources.vEmployee, click Script View As, click CREATE to,
--           click New Query Editor Window)
( 70-464 )

Designing useful indexes is considered by many people as more of an art than a science. While there is some truth to this statement, a number of tools are available to assist with learning to create useful indexes

• Capture traces of activity using SQL Server Profiler
• Use Database Engine Tuning Advisor to analyze trace results

SQL Trace is a library of system stored procedures that can be used for server-side tracing, minimizing the performance impact.

image
Saving the trace to a file, filtered to DatabaseName = AdventureWorks. The filter was applied by clicking on the column name, which allows filtering.

DETA

The Database Engine Tuning Advisor utility analyzes the performance effects of workloads run against one or more databases. Typically these workloads are obtained from traces captured by SQL Server Profiler.


After analyzing the effects of a workload on your databases, Database Engine Tuning Advisor provides recommendations for improving the performance of your system.

From profiler, tools, DETA.

image

Selected workload file, and also databases and tables to tune.

image

The right-hand side shows the recommended index based on the workload we ran, shown on the left-hand side.

Best Practices


1. Never apply Database Engine Tuning Advisor recommendations without further reviewing what is being suggested.


2. Record details of why and when you create any indexes. DBAs are hesitant to ever remove indexes without this knowledge.


3. When DETA suggests new statistics, this should be taken as a hint to investigate the indexing
structure of the table.

( 70-464 )

You would use the INCLUDE to add one or more columns to the leaf level of a non-clustered index, if by doing so, you can "cover" your queries.

http://stackoverflow.com/questions/1307990/why-use-the-include-clause-when-creating-an-index

# Wednesday, 04 September 2013
( 70-464 )

Tables can be structured as heaps or have clustered indexes. Additional indexes are called nonclustered indexes.

image
Report on index use.

-- Demonstration 2A

-- Step 1: Reconnect this query window to the tempdb database on the AdventureWorks server.
--         (Right-click in whitespace, click Connection, click Change Connection,
--         then in the Connect to Server windows, enter AdventureWorks as the
--         server name and click Connect). Then use the tempdb database.

USE tempdb;
GO

-- Step 2: Create a table similar to the ones on the slide examples

CREATE TABLE dbo.Book
( ISBN nvarchar(20) PRIMARY KEY,
  Title nvarchar(50) NOT NULL,
  ReleaseDate date NOT NULL,
  PublisherID int NOT NULL
);
GO

-- Step 3: Create a nonclustered index on PublisherID and ReleaseDate DESC

CREATE NONCLUSTERED INDEX IX_Book_Publisher
  ON dbo.Book (PublisherID, ReleaseDate DESC);
GO

-- Step 4: Request an estimated execution plan for a query that needs lookups

SELECT PublisherID, Title, ReleaseDate
FROM dbo.Book
WHERE ReleaseDate > DATEADD(year,-1,SYSDATETIME())
ORDER BY PublisherID, ReleaseDate DESC;
GO

-- Step 5: Replace the index with one that includes the Title column

CREATE NONCLUSTERED INDEX IX_Book_Publisher
  ON dbo.Book (PublisherID, ReleaseDate DESC)
  INCLUDE (Title)
  WITH DROP_EXISTING;
GO

-- Step 6: Again, request an estimated execution plan for the query

SELECT PublisherID, Title, ReleaseDate
FROM dbo.Book
WHERE ReleaseDate > DATEADD(year,-1,SYSDATETIME())
ORDER BY PublisherID, ReleaseDate DESC;
GO

-- Step 7: Use the AdventureWorks Database

USE AdventureWorks;
GO

-- Step 8: Query the sys.index_columns system view

SELECT * FROM sys.index_columns;
GO

-- Step 9: Note the is_included_column column, the key_ordinal column
--         and the is_descending_key column

-- Step 10: Combine several system views in a query
--          to locate any included columns in the database

SELECT s.name AS SchemaName,
       OBJECT_NAME(i.object_id) AS TableOrViewName,
       i.name AS IndexName,
       c.name AS ColumnName
FROM sys.indexes AS i
INNER JOIN sys.index_columns AS ic
ON i.object_id = ic.object_id
INNER JOIN sys.columns AS c
ON ic.object_id = c.object_id
AND ic.column_id = c.column_id
INNER JOIN sys.objects AS o
ON i.object_id = o.object_id
INNER JOIN sys.schemas AS s
ON o.schema_id = s.schema_id
WHERE ic.is_included_column <> 0
AND s.name <> 'sys'
ORDER BY SchemaName, TableOrViewName, i.index_id, ColumnName;

-- Step 11: Use object explorer to view the properties of
--          the IX_ProductReview_ProductID_Name index on
--          the Production.ProductReview table. Note the structure
--          of the index and the included column

-- Step 12: Now review the properties of the other index from that
--          same table that appeared in our list of indexes with
--          included columns. Note that by definition, included
--          columns only apply to nonclustered indexes.

image
Have created one nonclustered index; now replace it with one that includes the Title column.


-- Step 5: Replace the index with one that includes the Title column

CREATE NONCLUSTERED INDEX IX_Book_Publisher
  ON dbo.Book (PublisherID, ReleaseDate DESC)
  INCLUDE (Title)

image

( 70-464 )

SQL Server exposes dynamic management views (DMVs) that can be used to explore query plan reuse.

image
Can view execution plan in Activity Monitor

Cached plans

-- Demonstration 3B

-- Step 1: Open a new query window to tempdb

USE tempdb;
GO

-- Step 2: Execute a query to retrieve existing query plans

SELECT cp.objtype AS PlanType,
       OBJECT_NAME(st.objectid,st.dbid) AS ObjectName,
       cp.refcounts AS ReferenceCounts,
       cp.usecounts AS UseCounts,
       st.text AS SQLBatch,
       qp.query_plan AS QueryPlan
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) AS qp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st;

-- Step 3: Clear the plan cache

DBCC FREEPROCCACHE;
GO

-- Step 4: Execute a query to retrieve existing query plans

SELECT cp.objtype AS PlanType,
       OBJECT_NAME(st.objectid,st.dbid) AS ObjectName,
       cp.refcounts AS ReferenceCounts,
       cp.usecounts AS UseCounts,
       st.text AS SQLBatch,
       qp.query_plan AS QueryPlan
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) AS qp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st;

-- Step 5: Click on one of the XML values returned
--         to a plan

image
Before the freeproccache.  340 rows cached.

image
After – almost nothing left in the cache.

Review Questions
1. What is the difference between a graphical execution plan and an XML execution plan?
2. Give an example of why a T-SQL DELETE statement could have a complex execution plan?


Best Practices
1. Avoid capturing execution plans for large numbers of statements when using SQL Profiler.
2. If you need to capture plans using Profiler, make sure the trace is filtered to reduce the number of
events being captured.

( 70-464 )

Execution plans are also called query plans.

They show how SQL Server intends to execute a query, or how it actually executed it.

image
Actual execution plan.

image
Estimated execution plan – there is no index on color.
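A minimal sketch of requesting plans from T-SQL rather than via the SSMS toolbar (assumes the AdventureWorks database):

USE AdventureWorks;
GO

-- Estimated plan: the statements that follow are compiled but not executed
SET SHOWPLAN_XML ON;
GO
SELECT ProductID, Name FROM Production.Product WHERE Color = N'Blue';
GO
SET SHOWPLAN_XML OFF;
GO

-- Actual plan: the statements are executed and the plan is returned alongside the results
SET STATISTICS XML ON;
GO
SELECT ProductID, Name FROM Production.Product WHERE Color = N'Blue';
GO
SET STATISTICS XML OFF;
GO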

Execution Context

( 70-464 )

PRIMARY KEY is a constraint. It is a logical concept that is supported by an index, but that index may or may not be clustered. By default, when a PRIMARY KEY constraint is added, its supporting index is created as clustered if no other clustered index already exists on the table.

WITH DROP_EXISTING

allows an existing index to be dropped and rebuilt with a new definition in a single statement.

FILLFACTOR and PAD_INDEX options are used to provide free space within index pages. They can improve INSERT and UPDATE performance, often to the detriment of SELECT performance.
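A minimal sketch (assumes the dbo.Book table from the INCLUDE demonstration above):

CREATE NONCLUSTERED INDEX IX_Book_Title
  ON dbo.Book (Title)
  WITH (FILLFACTOR = 80, PAD_INDEX = ON); -- leave 20% free space in leaf pages, and in intermediate pages too
GO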

CTRL+SHIFT+R – refresh IntelliSense in SSMS.

Characteristics of Good Clustering Key

  • Short
  • Static
  • Increasing
  • Unique. 

image

Review Questions

1. What is the main problem with uniqueidentifiers used as primary keys?
2. Where are newly inserted rows placed when a table is structured as a heap?

Best Practices

  • Unless specific circumstances arise, most tables should have a clustered index.
  • The clustered index may or may not be placed on the table's primary key.
  • When using GUID primary keys in the logical data model, consider avoiding their use throughout the physical implementation of the data model.

Application has a clustered index on ApplicationID (INT) IDENTITY – therefore unique.

Email has clustered index on EmailAddress

Reference has clustered index on ReferenceID (GUID)

image
Slower insert times with worse clustering key types.

USE MarketDev;
GO

CREATE TABLE Relationship.Table_Heap
( ApplicationID INT IDENTITY(1,1),
  ApplicantName NVARCHAR(150),
  EmailAddress NVARCHAR(100),
  ReferenceID UNIQUEIDENTIFIER,
  Comments NVARCHAR(500)
);
GO

CREATE TABLE Relationship.Table_ApplicationID
( ApplicationID INT IDENTITY(1,1),
  ApplicantName NVARCHAR(150),
  EmailAddress NVARCHAR(100),
  ReferenceID UNIQUEIDENTIFIER,
  Comments NVARCHAR(500)
);
GO
CREATE CLUSTERED INDEX IX_ApplicantID
  ON Relationship.Table_ApplicationID (ApplicationID);
GO

CREATE TABLE Relationship.Table_EmailAddress
( ApplicationID INT IDENTITY(1,1),
  ApplicantName NVARCHAR(150),
  EmailAddress NVARCHAR(100),
  ReferenceID UNIQUEIDENTIFIER,
  Comments NVARCHAR(500)
);
GO
CREATE CLUSTERED INDEX IX_EmailAddress
  ON Relationship.Table_EmailAddress (EmailAddress);
GO

CREATE TABLE Relationship.Table_ReferenceID
( ApplicationID INT IDENTITY(1,1),
  ApplicantName NVARCHAR(150),
  EmailAddress NVARCHAR(100),
  ReferenceID UNIQUEIDENTIFIER,
  Comments NVARCHAR(500)
);
GO
CREATE CLUSTERED INDEX IX_ReferenceID
  ON Relationship.Table_ReferenceID (ReferenceID);
GO

 

--Module 06 - Test Workload for Exercise 3

USE MarketDev;
GO

DECLARE @Counter INT;
DECLARE @StartTime DATETIME2;
DECLARE @HeapFinish DATETIME2;
DECLARE @ApplicationIDFinish DATETIME2;
DECLARE @EmailAddressFinish DATETIME2;
DECLARE @ReferenceIDFinish DATETIME2;

SET NOCOUNT ON;

SET @StartTime = SYSDATETIME();
SET @Counter = 0;

WHILE (@Counter < 20) BEGIN
  INSERT Relationship.Table_Heap
    (ApplicantName, EmailAddress, ReferenceID, Comments)
  SELECT p.FirstName + ' ' + p.LastName,
         p.EmailAddress,
         NEWID(),
         REPLICATE('A',500)
  FROM Marketing.Prospect AS p;
  SET @Counter += 1;
END;

SET @HeapFinish = SYSDATETIME();

PRINT 'Heap loaded';

--select * from Relationship.Table_Heap

SET @Counter = 0;

WHILE (@Counter < 20) BEGIN
  INSERT Relationship.Table_ApplicationID
    (ApplicantName, EmailAddress, ReferenceID, Comments)
  SELECT p.FirstName + ' ' + p.LastName,
         p.EmailAddress,
         NEWID(),
         REPLICATE('A',500)
  FROM Marketing.Prospect AS p;
  SET @Counter += 1;
END;

SET @ApplicationIDFinish = SYSDATETIME();

PRINT 'ApplicationID Table Loaded';

SET @Counter = 0;

WHILE (@Counter < 20) BEGIN
  INSERT Relationship.Table_EmailAddress
    (ApplicantName, EmailAddress, ReferenceID, Comments)
  SELECT p.FirstName + ' ' + p.LastName,
         p.EmailAddress,
         NEWID(),
         REPLICATE('A',500)
  FROM Marketing.Prospect AS p;
  SET @Counter += 1;
END;

SET @EmailAddressFinish = SYSDATETIME();

PRINT 'EmailAddress Table Loaded';

SET @Counter = 0;

WHILE (@Counter < 20) BEGIN
  INSERT Relationship.Table_ReferenceID
    (ApplicantName, EmailAddress, ReferenceID, Comments)
  SELECT p.FirstName + ' ' + p.LastName,
         p.EmailAddress,
         NEWID(),
         REPLICATE('A',500)
  FROM Marketing.Prospect AS p;
  SET @Counter += 1;
END;

SET @ReferenceIDFinish = SYSDATETIME();

PRINT 'ReferenceID Table Loaded';

SELECT DATEDIFF(second,@StartTime,@HeapFinish) AS HeapTime,
       DATEDIFF(second,@HeapFinish,@ApplicationIDFinish) AS ApplicationIDTime,
       DATEDIFF(second,@ApplicationIDFinish,@EmailAddressFinish) AS EmailAddressTime,
       DATEDIFF(second,@EmailAddressFinish,@ReferenceIDFinish) AS ReferenceIDTime;
( 70-464 )

Composite indexes – constructed on multiple columns.

Often more useful in business apps:

  • Higher selectivity
  • Possibility of avoiding the need to sort the output rows

Ascending vs Descending

For a single-column index, ascending and descending are equally useful; it makes no difference.

For composite indexes, the key directions can be used to avoid sorts.
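A hedged sketch (assumes the dbo.Book table from the earlier INCLUDE demonstration):

-- Matching the key order and direction of the index to the query avoids a Sort operator
CREATE NONCLUSTERED INDEX IX_Book_Publisher_ReleaseDate
  ON dbo.Book (PublisherID ASC, ReleaseDate DESC);
GO

SELECT PublisherID, ReleaseDate
FROM dbo.Book
ORDER BY PublisherID ASC, ReleaseDate DESC; -- can be returned in index order, no sort needed
GO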

Statistics

SQL Server keeps statistics on indexes (and on columns) so that the query optimizer can estimate how many rows a predicate will return.

image
An IDENTITY (PK) and GUID.  Index on email address too.

USE AdventureWorks;
GO

select * from person.Contact

-- Step 2: Explore the existing statistics on the Person.Contact table.
--         There may or may not be statistics available. If they are
--         present and have a _WA prefix, they are autostats.

SELECT * FROM sys.stats WHERE object_id = OBJECT_ID('Person.Contact');
GO

-- Step 3: Create statistics on the FirstName column in the Person.Contact table

CREATE STATISTICS Person_FirstName_Stats ON Person.Contact (FirstName)
WITH FULLSCAN;
GO

-- Step 4: Check the list of statistics again

SELECT * FROM sys.stats WHERE object_id = OBJECT_ID('Person.Contact');
GO

-- Step 5: Use DBCC SHOW_STATISTICS to obtain details of the new statistics

DBCC SHOW_STATISTICS('Person.Contact',Person_FirstName_Stats);
GO

-- Step 6: Note how many rows exist for the name Abigail

-- Step 7: Execute a query to count the rows to see if the statistics are accurate

SELECT COUNT(1) FROM Person.Contact WHERE FirstName = 'Abigail';
GO

Have created Statistics for Person.Contact (FirstName)

image
Creating statistics. The oddly named (_WA-prefixed) statistics were created automatically when a query was run.

  • Design indexes to maximize selectivity, which leads to lower I/O.
  • In the absence of other requirements, aim to have the most selective columns first in composite indexes.
( 70-464 )

Indexes with numeric keys such as int and bigint work efficiently.

Character-based index keys are both common and useful.

Dates are good index keys – only slightly less efficient than int, and date is more efficient than datetime.

GUIDs are reasonably efficient; however, their randomness causes fragmentation.

bit – can be useful. An index on an IsFinalised column where only 500 out of 100 million rows are unfinalised makes a good candidate, although it is entirely useless for finding finalised transactions.

Indexing computed columns – e.g., to return sales by day of week, add a computed column that determines the day of week based on the date of the sale, and then index that computed column (a hedged sketch follows).
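A hedged sketch of indexing a computed column (hypothetical dbo.Sale table in tempdb). DATEPART(weekday, ...) depends on the SET DATEFIRST setting, so a DATEFIRST-independent expression is used here to keep the column deterministic and therefore indexable:

USE tempdb;
GO
CREATE TABLE dbo.Sale
( SaleID int IDENTITY PRIMARY KEY,
  SoldWhen date NOT NULL,
  Amount decimal(18,2) NOT NULL,
  SoldDayOfWeek AS (DATEDIFF(day, DATEFROMPARTS(1900,1,1), SoldWhen) % 7) -- 0 = Monday
);
GO

CREATE NONCLUSTERED INDEX IX_Sale_SoldDayOfWeek
  ON dbo.Sale (SoldDayOfWeek);
GO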

# Tuesday, 03 September 2013
( 70-464 )

PK Constraints

image
Much nicer to name Constraints.

image

CREATE TABLE dbo.Customer
(
    --CustomerID int IDENTITY(1,1) CONSTRAINT PK_Customer PRIMARY KEY,
    CustomerID int IDENTITY(1,1) PRIMARY KEY,
    CustomerName nvarchar(50) NOT NULL
);

Natural vs Surrogate keys.

UNIQUE Constraints

A UNIQUE constraint is used to ensure that no two rows hold the same value in a column (or combination of columns), e.g., a Spanish DNI national identity number.
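A minimal sketch (hypothetical dbo.Person table):

CREATE TABLE dbo.Person
( PersonID int IDENTITY(1,1) CONSTRAINT PK_Person PRIMARY KEY,
  FullName nvarchar(50) NOT NULL,
  DNI char(9) NOT NULL CONSTRAINT UQ_Person_DNI UNIQUE
);
GO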

FK Constraints

image

 

--Demonstration 3A - Entity and Referential Integrity

-- Step 1: Open a new query window to tempdb

USE tempdb;
GO

--drop table CustomerOrder
--drop table customer

-- Step 2: Create the Customer and CustomerOrder tables
--         and populate them

CREATE TABLE dbo.Customer
(
    --CustomerID int IDENTITY(1,1) CONSTRAINT PK_Customer PRIMARY KEY,
    CustomerID int IDENTITY(1,1) PRIMARY KEY,
    CustomerName nvarchar(50) NOT NULL
);
GO
INSERT dbo.Customer
  VALUES (' Marcin Jankowski'),('Darcy Jayne');
GO

select * from customer

CREATE TABLE dbo.CustomerOrder
(
    CustomerOrderID int IDENTITY(1000001,1) PRIMARY KEY,
    CustomerID int NOT NULL
      FOREIGN KEY REFERENCES dbo.Customer (CustomerID),
    OrderAmount decimal(18,2) NOT NULL
);
GO

-- Step 3: Select the list of customers and
--         perform a valid insert into the CustomerOrder table

SELECT * FROM dbo.Customer;
GO

INSERT INTO dbo.CustomerOrder (CustomerID, OrderAmount)
  VALUES (1, 12.50), (2, 14.70);
GO
SELECT * FROM dbo.CustomerOrder;
GO

-- Step 4: Try to insert a CustomerOrder row for an invalid customer
--         Note how poor the error messages look when constraints are
--         not named appropriately

INSERT INTO dbo.CustomerOrder (CustomerID, OrderAmount)
  VALUES (3, 15.50);
GO

-- Step 5: Try to remove a customer that has an order
--         Again note how the poor naming doesn't help much.

DELETE FROM dbo.Customer WHERE CustomerID = 1;
GO

-- Step 6: Remove the foreign key constraint and
--         replace it with a named constraint with cascade.
--         Note that you will need to copy into this code the
--         name of the constraint returned in the error from the
--         previous statement. This is part of the problem
--         when constraints are not named.

ALTER TABLE dbo.CustomerOrder
  DROP CONSTRAINT FK__CustomerO__Custo__66603565;
GO

ALTER TABLE dbo.CustomerOrder
  ADD CONSTRAINT FK_CustomerOrder_Customer
  FOREIGN KEY (CustomerID)
  REFERENCES dbo.Customer (CustomerID)
  ON DELETE CASCADE;
GO

-- Step 7: Select the list of customer orders, try a delete again
--         and note that the delete is now possible

SELECT * FROM dbo.CustomerOrder;
GO
DELETE FROM dbo.Customer WHERE CustomerID = 1;
GO

-- Step 8: Note how the cascade option caused the orders for the
--         deleted customer to also be deleted

SELECT * FROM dbo.Customer;
SELECT * FROM dbo.CustomerOrder;
GO

-- Step 9: Try to drop the referenced table and note the error:
DROP TABLE dbo.Customer;
GO

WITH NOCHECK – does not apply the check to existing rows

Cascading referential integrity – use with caution.

Considerations for Constraint Checking

When bulk loading or updating data, disabling constraints (i.e., CHECK and FOREIGN KEY constraints) and re-enabling them afterwards is often more performant.

IDENTITY

The value is not provided by the INSERT statement; SQL Server generates it.

Typically an int or bigint column.

  • seed – starting value
  • increment – how much the value goes up each time

SET IDENTITY_INSERT <table> ON allows explicit values to be inserted into the identity column (a sketch follows).
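A hedged sketch (uses the dbo.Opportunity table created in the demonstration below):

SET IDENTITY_INSERT dbo.Opportunity ON;

INSERT INTO dbo.Opportunity
  (OpportunityID, Requirements, ReceivedDate, Rating) -- an explicit column list is required
VALUES (1000, 'n.d.', SYSDATETIME(), 5);              -- explicit identity value

SET IDENTITY_INSERT dbo.Opportunity OFF;
GO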

-- Step 2: Create the dbo.Opportunity table

CREATE TABLE dbo.Opportunity
(
  OpportunityID int NOT NULL
    IDENTITY(1,1),
  Requirements nvarchar(50) NOT NULL,
  ReceivedDate date NOT NULL,
  LikelyClosingDate date NULL,
  SalespersonID int NULL,
  Rating int NOT NULL
);

-- Step 3: Populate the table with 2 rows

INSERT INTO dbo.Opportunity
  (Requirements, ReceivedDate, LikelyClosingDate,
   SalespersonID,Rating)
VALUES ('n.d.', SYSDATETIME(), DATEADD(month,1,SYSDATETIME()), 34,9),
       ('n.d.', SYSDATETIME(), DATEADD(month,1,SYSDATETIME()), 37,2);

select @@IDENTITY

-- Step 4: Check the identity values added

SELECT * FROM dbo.Opportunity;

image
SELECT @@IDENTITY returns only the last identity value inserted (from the multi-row insert).

SCOPE_IDENTITY() should be used instead of @@IDENTITY if, for example, a trigger wrote to an auditing table that has its own identity column (@@IDENTITY would return that table's value instead).

For multi row inserts we’d typically use an OUTPUT clause in the INSERT statement.
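A minimal sketch (uses the dbo.Opportunity table from this demonstration):

INSERT INTO dbo.Opportunity
  (Requirements, ReceivedDate, Rating)
OUTPUT inserted.OpportunityID, inserted.Requirements  -- returns the identity value generated for each row
VALUES ('n.d.', SYSDATETIME(), 7),
       ('n.d.', SYSDATETIME(), 3);
GO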

SEQUENCE

Sequences are not tied to a specific table.  Can provide key values for a group of tables.

image
Disparate tables, but it would be useful to have IDs that are unique across all of them.

-- Step 12: Create a sequence to use with all 3 tables

CREATE SEQUENCE dbo.BookingID AS INT
  START WITH 1000001
  INCREMENT BY 1;
GO

-- Step 13: Recreate the tables using the sequence for default values

CREATE TABLE dbo.FlightBookings
( FlightBookingID INT NOT NULL
    CONSTRAINT DF_FlightBookings_FlightBookingID DEFAULT (NEXT VALUE FOR dbo.BookingID)
    CONSTRAINT PK_FlightBookings PRIMARY KEY,
  DepartureAirportCode VARCHAR(4) NOT NULL,
  ArrivalAirportCode VARCHAR(4) NOT NULL,
  CustomerID INT NOT NULL,
  DepartingAt DATETIMEOFFSET(0) NOT NULL,
  EstimatedArrival DATETIMEOFFSET(0) NOT NULL,
  IataAirlineCode VARCHAR(2) NOT NULL,
  FlightNumber VARCHAR(4) NOT NULL,
  ShortBookingDescription AS 'Flight-' + CAST(CustomerID AS VARCHAR(8))
    + '-' + DepartureAirportCode + '-' + ArrivalAirportCode
    + '-' + IataAirlineCode + FlightNumber
);

CREATE TABLE dbo.CarBookings
( CarBookingID INT NOT NULL
    CONSTRAINT DF_CarBookings_CarBookingID DEFAULT (NEXT VALUE FOR dbo.BookingID)
    CONSTRAINT PK_CarBookings PRIMARY KEY,
  CollectionPointCode VARCHAR(3) NOT NULL,
  ReturnPointCode VARCHAR(3) NOT NULL,
  CustomerID INT NOT NULL,
  CollectingAt DATETIMEOFFSET(0) NOT NULL,
  ReturningBy DATETIMEOFFSET(0) NOT NULL,
  VehicleTypeID INT NOT NULL,
  ShortBookingDescription AS 'Car-' + CAST(CustomerID AS VARCHAR(8))
    + '-' + CollectionPointCode + '-' + ReturnPointCode
    + '-' + CAST(VehicleTypeID AS VARCHAR(8))
);

CREATE TABLE dbo.HotelBookings
( HotelBookingID INT NOT NULL
    CONSTRAINT DF_HotelBookings_HotelBookingID DEFAULT (NEXT VALUE FOR dbo.BookingID)
    CONSTRAINT PK_HotelBookings PRIMARY KEY,
  HotelCode VARCHAR(5) NOT NULL,
  CustomerID INT NOT NULL,
  CheckinTime DATETIMEOFFSET(0) NOT NULL,
  CheckoutTime DATETIMEOFFSET(0) NOT NULL,
  RoomTypeId INT NOT NULL,
  ShortBookingDescription AS 'Hotel-' + CAST(CustomerID AS VARCHAR(8))
    + '-' + HotelCode + '-' + CAST(RoomTypeID AS VARCHAR(4))
);

-- Step 14: Reinsert the same data

INSERT dbo.FlightBookings
  ( DepartureAirportCode, ArrivalAirportCode, CustomerID,
    DepartingAt, EstimatedArrival, IataAirlineCode, FlightNumber)
VALUES ('MEL','LAX', 4, '2011-07-30 13:30:00 +10:00', '2011-07-30 07:45:00 -8:00', 'QF', '25'),
       ('LAX','SEA', 3, '2011-07-30 16:30:00 -8:00', '2011-07-30 18:00:00 -8:00', 'AS', '245');

INSERT dbo.CarBookings
  ( CollectionPointCode, ReturnPointCode, CustomerID,
    CollectingAt, ReturningBy, VehicleTypeID )
VALUES ( 'MEL', 'BNE', 8, '2011-07-30 13:30:00 +10:00', '2011-07-31 07:45:00 +10:00', 6),
       ( 'LAX', 'LAX', 7, '2011-07-31 13:30:00 +10:00', '2011-07-31 07:45:00 -8:00', 6);

INSERT dbo.HotelBookings
  ( HotelCode, CustomerID, CheckinTime, CheckoutTime, RoomTypeId )
VALUES ( 'HLMEL', 15, '2011-07-30 13:30:00 +10:00', '2011-07-31 07:45:00 +10:00', 4);

INSERT dbo.FlightBookings
  ( DepartureAirportCode, ArrivalAirportCode, CustomerID,
    DepartingAt, EstimatedArrival, IataAirlineCode, FlightNumber)
VALUES ('MEL','LAX', 4, '2011-07-31 13:30:00 +10:00', '2011-07-31 07:45:00 -8:00', 'QF', '25'),
       ('LAX','SEA', 6, '2011-07-31 16:30:00 -8:00', '2011-07-31 18:00:00 -8:00', 'AS', '245');

INSERT dbo.CarBookings
  ( CollectionPointCode, ReturnPointCode, CustomerID,
    CollectingAt, ReturningBy, VehicleTypeID )
VALUES ( 'MEL', 'BNE', 9, '2011-08-30 13:30:00 +10:00', '2011-08-31 07:45:00 +10:00', 6),
       ( 'LAX', 'LAX', 8, '2011-08-31 13:30:00 +10:00', '2011-08-31 07:45:00 -8:00', 6);
       
INSERT dbo.HotelBookings
  ( HotelCode, CustomerID, CheckinTime, CheckoutTime, RoomTypeId )
VALUES ( 'HLLAX', 13, '2011-07-30 13:30:00 -6:00', '2011-07-31 07:45:00 -6:00', 4);

-- Step 15: Note the values now appearing in the view

SELECT FlightBookingID AS BookingID, ShortBookingDescription
FROM dbo.FlightBookings
UNION ALL
SELECT CarBookingID, ShortBookingDescription
FROM dbo.CarBookings
UNION ALL
SELECT HotelBookingID, ShortBookingDescription
FROM dbo.HotelBookings
ORDER BY BookingID;
GO

 

image
Using a Sequence for BookingID

-- Step 16: Note that sequence values can be created on the fly

SELECT NEXT VALUE FOR dbo.BookingID AS ID,
       *
FROM dbo.FlightBookings;
GO
( 70-464 )

Domain integrity limits the range and type of values that can be stored in a column. It is usually the most
important form of data integrity when first designing a database.

Data Types

  • system data types
  • alias data types – e.g., a weight could be stored as decimal(18,3) or decimal(12,2); creating an alias type such as ProductWeight standardizes this (see the sketch after this list)
  • user-defined (.NET CLR) types
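A hedged sketch of an alias data type (the ProductWeight type and dbo.Product table are hypothetical):

CREATE TYPE dbo.ProductWeight FROM decimal(18,3) NOT NULL;
GO

CREATE TABLE dbo.Product
( ProductID int IDENTITY PRIMARY KEY,
  ProductName nvarchar(50) NOT NULL,
  Weight dbo.ProductWeight        -- every weight column now shares the same definition
);
GO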

Column nullability – NULL is the absence of a value, e.g., LastSaleDate may legitimately have no value yet.

DEFAULT Constraints

Provide default values for columns

CHECK Constraints

Limit the values that are accepted into a column, e.g., a varchar(7) value must be 5 characters long if its first character is the letter A.

Table-level CHECK constraints – e.g., the FromDate column should not hold a later date than the ToDate column in the same row.

CREATE TABLE dbo.ProspectiveHire
(
    ProspectiveHireID int NOT NULL,
    GivenNames nvarchar(30) NOT NULL,
    FamilyName nvarchar(30) NULL,
    FullName nvarchar(50) NOT NULL,
    CountryCode nchar(3) NOT NULL
      CONSTRAINT CHK_ProspectiveHire_CountryCode_Length3
      CHECK (LEN(CountryCode) = 3),
    StateOrRegion nvarchar(20) NULL,
    DateOfBirth date NOT NULL
      CONSTRAINT CHK_ProspectiveHire_DateOfBirth_NotFuture
      CHECK (DateOfBirth < SYSDATETIME()),
    LikelihoodOfJoining int NOT NULL
      CONSTRAINT CHK_ProspectiveHire_LikelihoodOfJoining_Range1To5
      CHECK (LikelihoodOfJoining BETWEEN 1 AND 5)
      CONSTRAINT DF_ProspectiveHire_LikelihoodOfJoining
      DEFAULT (3),
    CONSTRAINT CHK_ProspectiveHire_US_States_Length2
    CHECK (CountryCode <> N'USA' OR LEN(StateOrRegion) = 2)
);
GO

-- Step 4: Execute statements to test the actions of the integrity constraints

-- INSERT a row providing all values ok:

INSERT INTO dbo.ProspectiveHire
  ( ProspectiveHireID, GivenNames, FamilyName, FullName,
    CountryCode, StateOrRegion, DateOfBirth, LikelihoodOfJoining)
  VALUES (1, 'Jon','Jaffe','Jon Jaffe',
          'USA', 'WA', '19730402', 4);
GO
SELECT * FROM dbo.ProspectiveHire;
GO

-- Step 5: INSERT rows that test the nullability and constraints

-- INSERT a row that fails a nullability test

INSERT INTO dbo.ProspectiveHire
  ( ProspectiveHireID, GivenNames, FamilyName,
    CountryCode, StateOrRegion, DateOfBirth, LikelihoodOfJoining)
  VALUES (2, 'Jacobsen','Lola',
          'USA', 'W', '19730405', 2);
GO

-- INSERT a row that fails the country code length test

INSERT INTO dbo.ProspectiveHire
  ( ProspectiveHireID, GivenNames, FamilyName, FullName,
    CountryCode, StateOrRegion, DateOfBirth, LikelihoodOfJoining)
  VALUES (2, 'Jacobsen','Lola', 'Lola Jacobsen',
          'US', 'WA', '20600405', 2);
GO

-- INSERT a row that fails the date of birth test

INSERT INTO dbo.ProspectiveHire
  ( ProspectiveHireID, GivenNames, FamilyName, FullName,
    CountryCode, StateOrRegion, DateOfBirth, LikelihoodOfJoining)
  VALUES (2, 'Jacobsen','Lola', 'Lola Jacobsen',
          'USA', 'WA', '20600405', 2);
GO

-- INSERT a row that fails the US State length test

INSERT INTO dbo.ProspectiveHire
  ( ProspectiveHireID, GivenNames, FamilyName, FullName,
    CountryCode, StateOrRegion, DateOfBirth, LikelihoodOfJoining)
  VALUES (2, 'Jacobsen','Lola', 'Lola Jacobsen',
          'USA', 'W', '19730405', 2);
GO

Then


-- Step 5: Query sys.sysconstraints to see the list of constraints
--         that have been cataloged

SELECT OBJECT_NAME(id) AS TableName,
       OBJECT_NAME(constid) AS ConstraintName,
       *
FROM sys.sysconstraints;
GO

 

image

( 70-464 )

Where to enforce business rules

  • UI – gives instant feedback
  • Middle tier
  • Data tier (the database)

Usually you want to enforce the rules at multiple levels.

3 basic forms of data integrity enforced in database applications:

  • domain (column) integrity – e.g., tinyint enforces 0 to 255
  • entity integrity – e.g., a primary key
  • referential integrity – enforces that relationships are always maintained

image

( 70-464 )

Since SQL Server 2005, schemas have been used as containers for objects, much like folders.

image
Permissions can be assigned at the schema level, which can greatly simplify system security.

If the schema name is omitted, name resolution rules apply (the user's default schema is checked first, then dbo).
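A hedged sketch of a schema-level permission (the Sales schema and SalesReader user are hypothetical):

GRANT SELECT ON SCHEMA::Sales TO SalesReader;
GO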

# Thursday, 29 August 2013
( 10776A | 70-464 | HyperV )

Benefits of using MSTSC:

  • Allows full screen
  • Copy and paste easily

Setup networking to be dynamic IP on Win Server:

image

Allow remote desktop: (right click on start, computer, advanced)

image

Shut down the VM, then add hardware for it.

image
Add Hardware, Network Adapter, and give it the correct Virtual Switch.  Here it is full LAN which allows access to the internet.

Also changed RAM usage to dynamic.

Fire up the VM and find its IP address.

image
Test connecting this way

image
Or use the computer’s name

image
MSTSC and copy paste working.  And internet access from the Win2K8 server.

Wireless

If you have two connections, you need another virtual switch.

image
then ‘wire’ up:

image

image
Then you can connect via the machine name. Notice the IP address comes from DHCP on the router.

| | #