
Programmatically Create Data Flow Task with Row Count Transformation in SSIS Package Using C#

Abstract
This article explains how to programmatically create a data flow task with a row count transformation in a SQL Server Integration Services (SSIS) package using C#.
Requirements
Article
If the above requirements are all met, we will begin by launching Microsoft Visual Studio 2008.
Create a new Integration Services Project, which is located under Business Intelligence Projects.
After you have named the new project, drag a Script Task onto the Control Flow pane of the new package.
Right-click the Script Task and click “Edit”.
Under the Script Task Editor, change the “ScriptLanguage” to “Microsoft Visual C# 2008”.
In Project Explorer, add the relevant references and ensure that you have declared the namespaces as below:
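The exact assemblies depend on your SQL Server installation, but for SSIS 2008 the following references and namespace declarations are a reasonable minimal sketch:

// Add these references in Project Explorer (names may vary slightly by installation):
//   Microsoft.SQLServer.ManagedDTS
//   Microsoft.SQLServer.DTSPipelineWrap
using System;
using Microsoft.SqlServer.Dts.Runtime;            // Application, Package, ConnectionManager, TaskHost, DtsConvert
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;   // IDTSComponentMetaData100, CManagedComponentWrapper, MainPipe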
After declarations, create an instance of the application and package:
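A minimal sketch, assuming the package object is named p (the name used throughout the listing that follows):

Application app = new Application();
Package p = new Package();
p.Name = "RowCountPackage";   // hypothetical package name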
Insert Data Flow Task:
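The data flow task is added through the STOCK:PipelineTask moniker, and its inner MainPipe object is what the rest of the listing refers to as dataFlowTask:

// Add the Data Flow Task and grab its pipeline
Executable dataFlowExecutable = p.Executables.Add("STOCK:PipelineTask");
TaskHost dataFlowHost = dataFlowExecutable as TaskHost;
dataFlowHost.Name = "Data Flow Task";
MainPipe dataFlowTask = dataFlowHost.InnerObject as MainPipe;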
Insert your connection:
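The later code expects an OLE DB source component named component and a destination connection manager named cm_DES, so a sketch of this step, assuming source and destination sit on the same server and a placeholder table your_source_table, could look like this:

// Source and destination OLE DB connection managers
ConnectionManager cm_SRC = p.Connections.Add("OLEDB");
cm_SRC.ConnectionString = "Provider=SQLOLEDB.1;Data Source=your_server;Initial Catalog=your_database;Integrated Security=SSPI;";
cm_SRC.Name = "cm_SRC";

ConnectionManager cm_DES = p.Connections.Add("OLEDB");
cm_DES.ConnectionString = cm_SRC.ConnectionString;   // assumption: same server and database
cm_DES.Name = "cm_DES";

// OLE DB source component (referred to as "component" below)
IDTSComponentMetaData100 component = dataFlowTask.ComponentMetaDataCollection.New();
component.ComponentClassID = "DTSAdapter.OleDbSource";
component.Name = "OLEDBSource";
CManagedComponentWrapper srcDesignTime = component.Instantiate();
srcDesignTime.ProvideComponentProperties();
component.RuntimeConnectionCollection[0].ConnectionManagerID = cm_SRC.ID;
component.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.GetExtendedInterface(cm_SRC);
srcDesignTime.SetComponentProperty("OpenRowset", "your_source_table");
srcDesignTime.SetComponentProperty("AccessMode", 0);   // 0 = table or view (OpenRowset)
srcDesignTime.AcquireConnections(null);
srcDesignTime.ReinitializeMetaData();
srcDesignTime.ReleaseConnections();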
The rest of the code should look as follows:
Variable counter = p.Variables.Add("var_counter", false, "User", 0);

// Row Count transformation that writes the number of rows passing through into User::var_counter
IDTSComponentMetaData100 dataConvertComponent = dataFlowTask.ComponentMetaDataCollection.New();
dataConvertComponent.ComponentClassID = "DTSTransform.RowCount";
dataConvertComponent.Name = "get row counter";
dataConvertComponent.Description = "gets row counter";
CManagedComponentWrapper dataConvertWrapper = dataConvertComponent.Instantiate();
dataConvertWrapper.ProvideComponentProperties();
dataConvertWrapper.SetComponentProperty("VariableName", "User::var_counter");

// Connect the source and the transform
dataFlowTask.PathCollection.New().AttachPathAndPropagateNotifications(
    component.OutputCollection[0], dataConvertComponent.InputCollection[0]);

// OLE DB destination
IDTSComponentMetaData100 destination = dataFlowTask.ComponentMetaDataCollection.New();
destination.ComponentClassID = "DTSAdapter.OleDbDestination";
destination.Name = "OLEDBDestination";
CManagedComponentWrapper destDesignTime = destination.Instantiate();
destDesignTime.ProvideComponentProperties();
destination.RuntimeConnectionCollection[0].ConnectionManagerID = cm_DES.ID;
destination.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.GetExtendedInterface(cm_DES);
destDesignTime.SetComponentProperty("OpenRowset", "your_destination_table");
destDesignTime.SetComponentProperty("AccessMode", 3);   // 3 = OpenRowset using FastLoad
destDesignTime.SetComponentProperty("FastLoadOptions", "TABLOCK,CHECK_CONSTRAINTS");

// Connect the transform and the destination
IDTSPath100 path = dataFlowTask.PathCollection.New();
path.AttachPathAndPropagateNotifications(dataConvertComponent.OutputCollection[0], destination.InputCollection[0]);

// Refresh the destination metadata from the target table
destDesignTime.AcquireConnections(null);
destDesignTime.ReinitializeMetaData();
destDesignTime.ReleaseConnections();

// Map the destination input columns to the external (table) columns
IDTSInput100 input = destination.InputCollection[0];
IDTSVirtualInput100 vInput = input.GetVirtualInput();
foreach (IDTSVirtualInputColumn100 vColumn in vInput.VirtualInputColumnCollection)
{
    // Select the column, and retain the new input column
    IDTSInputColumn100 inputColumn = destDesignTime.SetUsageType(
        input.ID, vInput, vColumn.LineageID, DTSUsageType.UT_READONLY);
    // Find the external column by name
    IDTSExternalMetadataColumn100 externalColumn =
        input.ExternalMetadataColumnCollection[inputColumn.Name];
    // Map the input column to the external column
    destDesignTime.MapInputColumn(input.ID, inputColumn.ID, externalColumn.ID);
}

// Connection manager and Execute SQL Task that write the captured row count to a table
ConnectionManager ConMgr = p.Connections.Add("OLEDB");
ConMgr.ConnectionString = string.Format(
    "Provider=SQLOLEDB.1;Data Source={0};Initial Catalog={1};Integrated Security=SSPI;", "your_server", "your_database");
ConMgr.Name = "ConMgr";

Executable exec = p.Executables.Add("STOCK:SQLTask");
TaskHost th = exec as TaskHost;
th.Properties["Name"].SetValue(th, "insert row count");
th.Properties["Description"].SetValue(th, "insert row count");
th.SetExpression("SqlStatementSource",
    "\"insert into your_table_name values ('\" + (DT_WSTR, 100) @[User::var_counter] + \"')\"");
th.SetExpression("Connection", "\"ConMgr\"");
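Finally, if you want the generated package written out to disk as a .dtsx file, the Application object created earlier can save it; the path below is only a placeholder:

// Save the in-memory package to a .dtsx file (adjust the path to suit)
app.SaveToXml(@"C:\your_folder\RowCountPackage.dtsx", p, null);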
Conclusion
It’s that simple!
You can now execute the Script Task, and the package will be created in the location you specified.


Sifiso

October 13, 2022
Sifiso is a Data Architect and Technical Lead at SELECT SIFISO – a technology consulting firm focusing on cloud migrations, data ingestion, DevOps, reporting and analytics. Sifiso has over 15 years of experience across private and public business sectors, helping businesses implement Microsoft, AWS and open-source technology solutions. He is a member of the Johannesburg SQL User Group and holds a Master's degree (MCom in IT Management) from the University of Johannesburg.
