Wednesday, December 28, 2011

Silverlight PivotViewer

Silverlight PivotViewer is an amazing control for visualizing large data sets. It is widely adopted in many business scenarios. The Channel9 talk covers the key features of the control.

It makes use of Deep Zoom technology to visualize large data sets. Because it is a closed control, we cannot apply styling to it directly.

Extending the Control

The workaround is to understand the object hierarchy of the control and extend it by exposing accessible interfaces to the underlying controls. Using this technique you can customize the visual appearance of the control; refer to the XPert360 blog.

To explore the existing object model you can use Silverlight Spy. You can refer to the sample code here for exposing a wrapper, and for step-by-step lessons on customizing the control, refer to the Lessons in PivotViewer project on CodePlex. A rough sketch of the idea follows.
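Here is a minimal sketch of that wrapper idea, assuming the Silverlight 4 PivotViewer (System.Windows.Pivot) can be subclassed; the class name, the Loaded hook, and the Border recoloring below are my own illustrative choices, not part of the control's API.

using System.Collections.Generic;
using System.Windows;
using System.Windows.Controls;
using System.Windows.Media;

// Sketch: expose the inner visual tree of PivotViewer so it can be restyled.
public class StyledPivotViewer : System.Windows.Pivot.PivotViewer
{
    public StyledPivotViewer()
    {
        // The inner controls only exist once the control has loaded.
        Loaded += (s, e) => ApplyCustomLook();
    }

    private void ApplyCustomLook()
    {
        // Illustrative only: recolor every Border found inside the control.
        foreach (DependencyObject child in GetDescendants(this))
        {
            Border border = child as Border;
            if (border != null)
                border.Background = new SolidColorBrush(Colors.Black);
        }
    }

    // Depth-first walk of the visual tree.
    private static IEnumerable<DependencyObject> GetDescendants(DependencyObject root)
    {
        int count = VisualTreeHelper.GetChildrenCount(root);
        for (int i = 0; i < count; i++)
        {
            DependencyObject child = VisualTreeHelper.GetChild(root, i);
            yield return child;
            foreach (DependencyObject grandChild in GetDescendants(child))
                yield return grandChild;
        }
    }
}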

In addition to styling the control, we can add new controls or views to it; refer to Roger Noble's blog for an example. You can also follow the MVVM pattern while using PivotViewer; refer to Timmy Kokke's blog for details.

Collections

To harness the power of PivotViewer, you must have solid collections in place. There is an array of tools for creating collections. PAuthor is my favorite; in addition to HTML-template-based image generation, you can build your own extensions to generate images using WPF controls. For real-time data, JIT (just-in-time) collections are ideal, and the JIT collection library can be enhanced to suit your needs.

Sample Collections

There is a great set of TechEd and PASS event Pivot collections listed here. We can also use the Pivot desktop tool to explore public collections like these.

SSIS Programming Reference

Generating SSIS packages from metadata is a powerful technique when you want to move a large number of tables from source to target. In this post I will highlight the programming references needed to create SSIS packages dynamically.
  • Microsoft.SqlServer.Dts.Runtime.Package is the high-level object which holds all the objects of the package. You create a new Package object to start building your object model.
  • The Connections collection holds the package's connection managers.
    Example: PackageObj.Connections.Add("OLEDB");
    Refer to MSDN for the list of connection types.
  • You can change a custom property of a connection by accessing the Properties collection of the connection (see the sketch below).
    Example: ConnectionObj.Properties["Format"].SetValue(csvFile, "Delimited");
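As a hedged sketch of the two bullets above, the following creates an OLE DB source connection and a flat-file target connection and sets a couple of custom properties. It assumes a Package object named package; the names, connection strings, and file path are placeholders.

// Assumes: using Microsoft.SqlServer.Dts.Runtime; and an existing Package named package.
ConnectionManager oleDbConn = package.Connections.Add("OLEDB");
oleDbConn.Name = "SourceDB";
oleDbConn.ConnectionString =
    "Provider=SQLNCLI10.1;Data Source=.;Initial Catalog=SourceDB;Integrated Security=SSPI;";

ConnectionManager csvConn = package.Connections.Add("FLATFILE");
csvConn.Name = "TargetCsv";
csvConn.ConnectionString = @"C:\exports\emp.csv";   // placeholder file path

// Custom properties are reached through the Properties collection.
csvConn.Properties["Format"].SetValue(csvConn, "Delimited");
csvConn.Properties["ColumnNamesInFirstDataRow"].SetValue(csvConn, true);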
  • Executables: by accessing the Executables collection you can add a DFT, an Execute SQL Task, or any other task to the package (a hedged Execute SQL Task sketch follows the example).
    Example: Package.Executables.Add("STOCK:SEQUENCE");
    For the list of monikers, refer to the SQLIS blog.
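For instance, here is a hedged sketch of adding an Execute SQL Task through the STOCK:SQLTask moniker and configuring it via its inner task object. It assumes a reference to Microsoft.SqlServer.SQLTask.dll and reuses the hypothetical SourceDB connection manager from the previous sketch; the SQL statement is a placeholder.

// Assumes: using Microsoft.SqlServer.Dts.Runtime;
//          using Microsoft.SqlServer.Dts.Tasks.ExecuteSQLTask;
Executable sqlExec = package.Executables.Add("STOCK:SQLTask");
TaskHost sqlHost = sqlExec as TaskHost;

// The real task sits behind the TaskHost wrapper.
ExecuteSQLTask sqlTask = sqlHost.InnerObject as ExecuteSQLTask;
sqlTask.Connection = "SourceDB";                       // connection manager name (placeholder)
sqlTask.SqlStatementSource = "TRUNCATE TABLE dbo.Emp"; // placeholder statement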
  • Dealing with the Data Flow Task (DFT) is more complex because DFT components are COM objects, and we use CManagedComponentWrapper to access them.
  • The hierarchy of the object structure is:
    • Executable
      • TaskHost
        • MainPipe
          • IDTSComponentMetaData100
// Create the package and add a Data Flow Task (DTS.Pipeline).
Package p = new Package();
Executable e = p.Executables.Add("DTS.Pipeline.1");
// The TaskHost wraps the task; its InnerObject is the data flow's MainPipe.
TaskHost thMainPipe = e as TaskHost;
MainPipe dataFlowTask = thMainPipe.InnerObject as MainPipe;

  • Inside the DFT, you can add pipeline components by using the ComponentMetaDataCollection of the MainPipe object.
// Add an OLE DB Source component to the data flow (the class ID is version specific).
IDTSComponentMetaData100 pipeLineItem = MainPipeObj.ComponentMetaDataCollection.New();
pipeLineItem.ComponentClassID = "DTSAdapter.OleDbSource.2";

  • When you use a source component in the pipeline, make sure it is instantiated using IDTSComponentMetaData100.Instantiate and initialized with CManagedComponentWrapper.ProvideComponentProperties. These are explicit interface implementations, so make sure the necessary casts are in place when you call them (a sketch of pointing the source at a table follows the example).
CManagedComponentWrapper InstanceSource = IDTSComponentMetaData100Obj.Instantiate();
InstanceSource.ProvideComponentProperties();
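After ProvideComponentProperties, the source still has to be told what to read. A minimal sketch, assuming the OLE DB Source's AccessMode and OpenRowset custom properties and a placeholder table name:

// 0 = "Table or view" access mode for the OLE DB Source; the table name is a placeholder.
InstanceSource.SetComponentProperty("AccessMode", 0);
InstanceSource.SetComponentProperty("OpenRowset", "[dbo].[Emp]");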

  • For source and destination components, make sure the connections are associated (a usage sketch follows the helper below).
public static void SetConnection(
    IDTSComponentMetaData100 obj, ConnectionManager con)
{
    // Associate the design-time connection manager with the component's runtime connection.
    obj.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.GetExtendedInterface(con);
    obj.RuntimeConnectionCollection[0].ConnectionManagerID = con.ID;
}
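A hypothetical call to the helper above, reusing the pipeLineItem source component from earlier and the placeholder SourceDB connection manager from the connection sketch:

SetConnection(pipeLineItem, package.Connections["SourceDB"]);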

  • After the connection is set, the component needs to be refreshed to pick up the metadata.
CManagedComponentWrapperObj.AcquireConnections(null);
CManagedComponentWrapperObj.ReinitializeMetaData();
CManagedComponentWrapperObj.ReleaseConnections();

  • You can attach multiple items to the pipeline.
  • To link the pipeline objects, use the path's AttachPathAndPropagateNotifications method.
IDTSPath100 path = dataFlow.PathCollection.New();
path.AttachPathAndPropagateNotifications(source.OutputCollection[0], target.InputCollection[0]);

  • To map the columns between pipeline items, you need to use the column collections (a usage sketch follows the helper below).
public void MapColumns(
    IDTSComponentMetaData100 DestinationTask
    , CManagedComponentWrapper InstanceDestination
    , DTSUsageType dtsUsageType)
{
    IDTSInput100 input = DestinationTask.InputCollection[0];
    IDTSVirtualInput100 vInput = input.GetVirtualInput();
    IDTSInputColumn100 vCol = null;

    if (dtsUsageType == DTSUsageType.UT_READONLY)
    {
        // Mark every upstream column as used, then map by matching column name.
        foreach (IDTSVirtualInputColumn100 vColumn in vInput.VirtualInputColumnCollection)
        {
            InstanceDestination.SetUsageType(input.ID, vInput, vColumn.LineageID, dtsUsageType);
        }

        foreach (IDTSInputColumn100 col in input.InputColumnCollection)
        {
            IDTSExternalMetadataColumn100 exCol = input.ExternalMetadataColumnCollection[col.Name];
            InstanceDestination.MapInputColumn(input.ID, col.ID, exCol.ID);
        }
    }
    else
    {
        // SetUsageType returns the input column, so map it in the same pass.
        foreach (IDTSVirtualInputColumn100 vColumn in vInput.VirtualInputColumnCollection)
        {
            vCol = InstanceDestination.SetUsageType(input.ID, vInput, vColumn.LineageID, dtsUsageType);
            IDTSExternalMetadataColumn100 exCol = input.ExternalMetadataColumnCollection[vColumn.Name];
            InstanceDestination.MapInputColumn(input.ID, vCol.ID, exCol.ID);
        }
    }
}
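A hypothetical call to MapColumns for a destination component whose metadata has already been reinitialized; destinationMeta and instanceDestination are assumed to be the destination's IDTSComponentMetaData100 and its CManagedComponentWrapper.

MapColumns(destinationMeta, instanceDestination, DTSUsageType.UT_READONLY);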
  • For connecting executables in the control flow, you can use the PrecedenceConstraint class to chain tasks for sequential execution.
PrecedenceConstraint prePC = DFT.ParentContainer.PrecedenceConstraints.Add(preExec, postExec);
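Finally, once the package object model is assembled, here is a minimal sketch for saving and running it; the file path is a placeholder and package stands for the Package instance built above.

// Microsoft.SqlServer.Dts.Runtime.Application: save the generated package as .dtsx and run it in-process.
Application app = new Application();
app.SaveToXml(@"C:\packages\LoadEmp.dtsx", package, null);

DTSExecResult result = package.Execute();
Console.WriteLine("Package finished with status: " + result);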

Monday, December 26, 2011

SSMS Cache

Microsoft SQL Server Management Studio is the Visual Studio for database folks. With SQL Server 2008 R2, the IntelliSense feature in SSMS is awesome: it saves time and the experience is great. Sometimes IntelliSense will not work, and there can be several reasons for this.

  • You may be typing multiple statements without proper batch separation (GO).
  • The underlying schema is changed.

If it is a schema change, you can try refreshing the IntelliSense cache (Edit > IntelliSense > Refresh Local Cache, or Ctrl+Shift+R); most of the time it helps.


Wednesday, December 21, 2011

Reading Dynamic SQL Results

Using dynamic SQL statements is very common when you want to do metadata-based operations. Basically, you build the dynamic statement using string concatenation and execute it using EXEC or sp_executesql.

The biggest challenge with dynamic SQL is that it executes in its own context, so you cannot pass or read variables seamlessly. You need to pass the variables from the calling context into the dynamic SQL context explicitly.

The following sample shows how you can pass a value to and read a value back from a dynamic SQL statement. For more details, check MSDN.

 

DECLARE @RC INT
DECLARE @SourceDB SYSNAME
DECLARE @SourceTable SYSNAME
DECLARE @sql NVARCHAR(MAX)
DECLARE @TableName VARCHAR(MAX)

SET @TableName = 'Emp'

SELECT @SourceDB = SourceDatabase,
       @SourceTable = SourceTable
FROM Metadata
WHERE DestinationTable = @TableName

SET @sql = 'SELECT @dynRC = COUNT(*) FROM '
           + @SourceDB + '.dbo.' + @SourceTable

EXEC SP_EXECUTESQL @sql, N'@dynRC INT OUTPUT',
     @dynRC = @RC OUTPUT

SELECT @RC

Friday, December 2, 2011

Dependent Objects

How many times have you needed to scan through a stored procedure to find all the tables it uses? Especially when you are working on a legacy application, this becomes a frequent task.

To simplify the work you can use the sys.sql_expression_dependencies catalog view.

You can use the following CTE, wrapped in a stored procedure, to get deeper object references (for example, EXEC dbo.GetDependentObjectList 'MyProcName'):

CREATE PROC [dbo].GetDependentObjectList @ObjectName AS sysname
AS
WITH ObjectDepends (entity_name, referenced_database_name, referenced_schema, referenced_entity, referenced_id, level)
AS
(
    SELECT entity_name = CASE referencing_class
               WHEN 1 THEN OBJECT_NAME(referencing_id)
               WHEN 12 THEN (SELECT t.name FROM sys.triggers AS t
                             WHERE t.object_id = sed.referencing_id)
               WHEN 13 THEN (SELECT st.name FROM sys.server_triggers AS st
                             WHERE st.object_id = sed.referencing_id) COLLATE database_default
           END
         , referenced_database_name
         , referenced_schema_name
         , referenced_entity_name
         , referenced_id
         , 0 AS level
    FROM sys.sql_expression_dependencies AS sed
    WHERE OBJECT_NAME(referencing_id) = @ObjectName

    UNION ALL

    SELECT entity_name = CASE sed.referencing_class
               WHEN 1 THEN OBJECT_NAME(sed.referencing_id)
               WHEN 12 THEN (SELECT t.name FROM sys.triggers AS t
                             WHERE t.object_id = sed.referencing_id)
               WHEN 13 THEN (SELECT st.name FROM sys.server_triggers AS st
                             WHERE st.object_id = sed.referencing_id) COLLATE database_default
           END
         , sed.referenced_database_name
         , sed.referenced_schema_name
         , sed.referenced_entity_name
         , sed.referenced_id
         , level + 1
    FROM ObjectDepends AS o
    JOIN sys.sql_expression_dependencies AS sed ON sed.referencing_id = o.referenced_id
)

SELECT entity_name AS referencing_entity, referenced_database_name, referenced_schema, referenced_entity, level
FROM ObjectDepends
ORDER BY level;

GO
