From: ben brugman on
Hello All,

From a stored procedure I would like to call an SSIS package.

I have two packages:
1. Does an FTP call to a legacy system to collect information and puts it in a local .csv file.
2. Processes a number of local .csv files, loads the information into an import table, then processes the import table.

I have tried to use xp_cmdshell from the same machine: the first SSIS package did not work, the second did.
So I replaced the first package with an FTP batch file which does the same job.
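
Roughly, the kind of xp_cmdshell call I mean is sketched below (the package path is a placeholder, not my real path):

-- Sketch: run a file-system SSIS package through xp_cmdshell.
-- The package path is a placeholder.
EXEC master..xp_cmdshell 'dtexec /F "D:\Packages\CollectLegacyData.dtsx"';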

But when deploying both packages to another system, both SSIS packages fail. (The FTP batch job does work.)

I have tried to use:
sp_xp_cmdshell_proxy_account
but am under the impression that this does not influence the account used by xp_cmdshell, and I can't get it to work.
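
What I tried was along these lines (a sketch; the account name and password are placeholders):

-- Sketch: set the proxy account used by xp_cmdshell for callers
-- that are not members of sysadmin (credentials are placeholders).
EXEC sp_xp_cmdshell_proxy_account 'DOMAIN\ssis_user', 'password';
-- Note: for sysadmin members, xp_cmdshell always runs under the
-- SQL Server service account, so the proxy account has no effect there.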

I was advised to use SQL storage for SSIS, but then somebody advised against SQL storage, because you could (??) not remove an SSIS package from SQL storage. (Is this true?) But the advice was to use SSIS storage. I think I should build a job with an SSIS package (from SSIS storage) in it and start this job using sp_start_job.

So my questions are:
1. What is good practice for using SSIS and starting it from a stored procedure?
2. How do I build a job which starts an SSIS package from SQL storage?
3. How do I start this job from a stored procedure (sp_start_job)?

Further, I use two parameters; these identify the database resource from which the 'other' parameters are read: one parameter for the SQL Server and one for the database. Can I use these parameters in the job?

Please give me some advice, because I am stuck with most of the methods I have tried.


Thanks for your time and attention,
Ben Brugman


From: John Bell on
On Fri, 18 Jun 2010 15:31:38 +0200, "ben brugman" <ben(a)niethier.nl>
wrote:


Hi

I assume you are currently using DTEXEC and xp_cmdshell, which was how
you had to do this in earlier versions of SQL Server.

You can run the packages as a specific job step without calling a
procedure; just set the step type to "SQL Server Integration Services
Package" when you create the job step.
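
A minimal sketch in T-SQL of such a job (the job name, package name and
server are placeholders; /SQL assumes the package is stored in MSDB,
and /SET shows one way your two parameters could be passed in):

USE msdb;
GO

-- Sketch: create a job with a single SSIS job step and start it.
EXEC sp_add_job @job_name = N'Load legacy data';

EXEC sp_add_jobstep
    @job_name  = N'Load legacy data',
    @step_name = N'Run SSIS package',
    @subsystem = N'SSIS',
    -- dtexec-style arguments; names and values are placeholders.
    @command   = N'/SQL "\ImportLegacy" /SERVER MYSERVER /SET \Package.Variables[User::ServerName].Value;MYSERVER';

EXEC sp_add_jobserver @job_name = N'Load legacy data';

-- From your stored procedure:
EXEC msdb.dbo.sp_start_job @job_name = N'Load legacy data';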

I suspect you have a permissions issue; you can prove that by running
the package from a command prompt. If that works, then try changing the
SQL Server Agent service account to be a domain account.
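
For example, something like this from a command prompt (the package
path is a placeholder; use /SQL instead of /F for a package stored in
MSDB):

rem Sketch: run the package directly to rule out a permissions problem.
dtexec /F "D:\Packages\CollectLegacyData.dtsx"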

John