Tuesday, March 27, 2012
Filling out missing data in subsequent records?
I receive several data files from another system that are more or less in an Excel-pivot-table-like format.
That is, the first row, representing the current data, is fully filled, while subsequent rows, representing historic data, are left partly empty.
Current rows and historic rows have different identifiers, e.g. rectype = 0 or 1.
Filling out the missing data on the historic records should be simple, if only all current rows were filled.
Some current rows aren't filled, so something like the following doesn't work:
update t1
set t1.colA =
(
select top 1 t2.colA
from mytable AS t2
where t2.rowid <= t1.rowid
and t2.cola <> 0
order by t2.rowid desc
)
from mytable AS t1
Somehow I need to check for the rectype, so I don't fill out rows with data from a previous entity
Any suggestions before I revert to using a cursor?
And while we are at it: I am in for an easy way to do this for all (about 60) columns in one move.
Before you ask:
After filling everything out we process the file to arrive at a few handy from-to tables, so we can use the correct data about the entity's status at a particular point in time elsewhere.
I am using MS SQL Server 2005, and solutions are allowed to use any specific trickery that MSSQL allows.
Many thanks for any constructive thoughts
Cheers
Drio|||Would this do the job? My changes are highlighted
UPDATE t1
SET t1.colA =
(
SELECT TOP 1 t2.colA
FROM MyTable AS t2
WHERE t2.rowid < t1.rowid
AND t2.colA <> 0
ORDER BY t2.rowid DESC
)
FROM mytable AS t1
WHERE rectype = 0
I haven't tested this code - it's only in my head (/on the screen) so don't use it on your live data ;)|||Thank you georgev:
for the tagline (I won't do it again; a real eye-opener)
for the small correction in my code and for the direction.
It goes wrong where there are history records after a current record with no data. Obviously they get filled from the previous current record that had data.
My interim solution
1. update all current records without data with a dummy value
2. use the fill out query
We then have to check the marked records and see if we can find a pattern that allows us to handle them in code (otherwise someone has to go through them manually; only 0.25% of the total).
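As for doing all (about 60) columns in one move: rather than writing the UPDATE by hand, it can be generated from the system catalog. An untested sketch, assuming the table is called mytable, its key columns are rowid and rectype, and rectype = 0 marks current rows:

```sql
-- Untested sketch: build one UPDATE covering every data column by reading
-- the column list from the system catalog, then execute it dynamically.
-- Assumes mytable's non-data columns are rowid and rectype (rectype = 0 =
-- current row, 1 = historic row); all other columns get filled.
DECLARE @sql nvarchar(max)

SELECT @sql = COALESCE(@sql + ', ', 'UPDATE t1 SET ')
    + 't1.' + c.name + ' = (SELECT TOP 1 t2.' + c.name
    + ' FROM mytable AS t2'
    + ' WHERE t2.rowid < t1.rowid AND t2.rectype = 0'
    + ' ORDER BY t2.rowid DESC)'
FROM sys.columns AS c
WHERE c.object_id = OBJECT_ID('mytable')
  AND c.name NOT IN ('rowid', 'rectype')

SET @sql = @sql + ' FROM mytable AS t1 WHERE t1.rectype = 1'

EXEC (@sql)
```

Because the subquery is restricted to rectype = 0, each historic row can only pick up values from the nearest current row above it, never from another entity's history rows.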
Thanks for your swift response
Cheers
Drio
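For the archives: the boundary problem described above (values leaking in from the previous entity when the own current row is empty) can also be avoided by filtering the subquery on rectype rather than on the value. An untested sketch for one column, assuming rectype = 0 marks current rows and missing values are NULL or 0:

```sql
-- Untested sketch: fill each historic row (rectype = 1) from the nearest
-- preceding current row (rectype = 0). Because the subquery only looks at
-- current rows, the nearest one above a historic row is its own entity's
-- current row - an empty current row then yields an empty fill instead of
-- a value leaked from the previous entity.
UPDATE t1
SET t1.colA =
(
    SELECT TOP 1 t2.colA
    FROM mytable AS t2
    WHERE t2.rowid < t1.rowid
      AND t2.rectype = 0
    ORDER BY t2.rowid DESC
)
FROM mytable AS t1
WHERE t1.rectype = 1
  AND (t1.colA IS NULL OR t1.colA = 0)
```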
Wednesday, March 7, 2012
File System Watcher can not read excel from SharePoint
I used a file system watcher to read an Excel file on my PC and it worked fine, but when I tried to read the Excel file from SharePoint it didn't work. The File Watcher task stayed yellow for so long that I had to stop the SSIS package.
So my question is: what is the cause of this? Do I need to set something, or am I missing something? Please help.
|||What is the path you are using?
I think of SharePoint files as being accessed via http, and of course a file system watcher will not work for that.
|||Have you tried enabling package logging and letting it run until it finishes?
It would be good to see what is reported in the execution log.
|||They can be referenced by UNC paths (like \\my.server.org\sites\MySite\Shared Documents\myfile.xls) but I believe that only works if you are on the same network as the server - not over the web.|||Thank you everybody. Sorry - I used the File Watcher Task, as provided on SQLIS.com, in SSIS, and used UNC paths and http to connect to SharePoint, but it did not connect. Am I missing something? When I used the watcher task to connect to my local computer it worked.|||
AU.T wrote:
Thank you everybody. Sorry I used file Watcher Task, as provided on SQLIS.com, in SSIS and used UNC paths and http to connect SharePoint but it did not connect . Am i missing somethig? Since if I used the watcher task to connect to my local computer it worked.
Try making a connection using Execute Process to do a "net use" first - that will make sure that you can connect and have security permissions. I've also found that when using UNC paths to SharePoint, I usually have to make an initial connection to establish my credentials before I can access it programmatically.
File System Task Error - Process in use
In the integration services package i'm working on i connect to an excel database and have two data flows which deal with two worksheets in the excel spreadsheet. The data flows do some transformation and then write the data out to a SQL database (two different tables). This is all working great; however, what i want to do once both data flows are complete is to move the directory which contains the current file that is being looped through. At present there is a foreach loop going through a number of directories. I have tried to implement the File System Task object to move the directory but get the following error:
[File System Task] Error: An error occurred with the following error message: "The process cannot access the file because it is being used by another process.".
I've read a similar post where by the problem was due to not closing the excel connection before doing the File System Task. I cannot seem to find where i would be able to do this.
If anyone has any ideas i'm looking forward to hearing them.
Many thanks in advance,
Grant
i'm not sure what's causing your problem. however, my hunch is that a data flow task is still executing when control is passed to the file system task. you need make sure that both data flow tasks have completed before control is passed to the next task. one way to do this is to place both data flow tasks within a single sequence container, and then link this container to the next task. that way, control won't proceed to the next task until both data flow tasks have completed.
i hope this helps.
|||Hi Duane,
Thanks for the response. I had the same thought that the task hadn't released the file when it came to the moving of the directory. I have tried the suggestion of using the sequence container but i still get the same result. The move directory task still thinks it is being used by another process.
Any further suggestions would be welcome. I'm thinking there must be a way to get round this, as i'm sure it must be a common problem.
Many Thanks,
Grant
|||Hi Duane,
Further to my last post: after some debugging i have found that it is the directory itself that seems to be in use by another process. I can remove the contents of the directory no problem at all. Does this shed any more light on the matter?
Cheers,
Grant|||Hi,
I have found what i believe to be my problem and can't believe i didn't see it earlier. I'm doing all of my file and directory processing within a foreach loop. Hence when i come to moving the current directory i'm sure the foreach loop is still using the folder as its enumerated value. I need to be able to somehow call the move directory task outwith the loop so that it doesn't have control of the process. Could anyone suggest a way of doing this?
Thanks in advance.
Grant|||
if i understand you correctly, you want both data flows to finish before control proceeds to moving the files. also, you want this process to repeat once for each file that you have. i'm also assuming that you have two separate directories of files that you want to process.
if my understanding above is correct, then you need to do the following: place two sequence containers inside a foreach loop container. connect an on success constraint from one sequence container to the other. there should be two separate data flow tasks inside the first sequence container. only when this sequence container succeeds will control flow to the second sequence container. the second sequence container should contain two separate file system tasks. only when this sequence container succeeds will control flow to the next iteration of the foreach loop container.
i hope this helps.
|||Hi,
I'll clarify what i have set up at present.
There is a root directory in which a number of directories are stored. Within each of these directories are two files (one CSV and one XLS). We can ignore the CSV file just now. I have two data flow tasks: one which extracts, transforms and loads data from one worksheet into a SQL table, and another which does a similar process but from a different worksheet and to a different SQL table. Upon completion of both these tasks (which are now in a sequence container) i want to move the directory containing both the files to an archive location. Regardless of whether the file system tasks are in a sequence container there is still an error relating to the directory move. It still believes that the directory is in use.
I still think that because the foreach loop is looping through the directories looking for each XLS file it is taking a hold on the directory itself and will not release the current directory until the next iteration of the loop.
With this thought in mind i thought that i could do something at the beginning of the loop to remove the directory that was referenced in the previous iteration (possibly set a variable at the end of each iteration). The only problem i encounter when i try this is that the variable has to be used as a source/destination value, so i cannot have it blank at the start - i get errors when this is attempted. I really need to skip the move directory on the first iteration and add an additional one at the end of the foreach loop.
My other thought is to just copy the directories during the loop and delete them after the loop has completed. I am aware however that using this method i run the risk that if something causes the package to crash midway through it might duplicate directories between the processing and archive directories.
Does this make any more sense?
Cheers,
Grant|||I have the problem resolved.
I have added the move directory task to the start of the foreach loop. A script bypasses the task on the first iteration, as there isn't a previous folder to archive. The archive folder path is assigned to the variable later in the loop, so on the next iteration the loop has a folder to move in the variable, which works no problem. After the loop is finished, one final move task deals with the final processed folder.
The reason it wouldn't allow an empty string as a connection source to begin with was that the "Delay Validation" property was set to false. If this is set to true the problem doesn't exist.
Thanks for all your help on this matter.
Grant
Sunday, February 26, 2012
File size big when download report
I download my report using the SQL Server Reporting Services export function
(Excel format). The file size is very big compared to my previous ASP
download function.
Note: The number of records is the same.
Ex: Using ASP, the file size is about 1000 kb
Using RS, the file size is about 23,000 kb.
Please help me - what should I do to reduce the file size to normal?
Thanks
Regards
Kumar.
Are you using the SP1 version of RS? SP1 introduced Excel export using its
native format, while RTM used MHTML, which was much larger.
--
-Daniel
This posting is provided "AS IS" with no warranties, and confers no rights.
"Kumar" <Kumar@.discussions.microsoft.com> wrote in message
news:5126C810-2785-4B32-A46C-6FB085FCAADD@.microsoft.com...
> Hi
> I download my report using SQL Server Reporting service export function
> (excel format), The file size is very big, when compare to my previous asp
> download function.
> Note: The no of rec is same.
> Ex: Using ASP, the file size is about 1000 kb
> Using RS, the file size is about 23,000 kb.
> Pls help me, what should I do to reduce the file size to normal.
> Thanks
> Regards
> Kumar.
>
Friday, February 24, 2012
File not found error when exporting
I'm hoping someone else has had this problem and can help me.
For some unknown reason all of a sudden the exporting of reports (to
PDF, Excel) causes a "File cannot be found" error. This started to
happen last week after 12 months of error-free operation.
I've tried deleting the reports and re-deploying them - no success.
I would be most grateful someone could offer some insight.
Many thanks in advance,
SteveB.
Does it happen from every workstation trying to retrieve the files? Does
it happen through the web interface or through web services access?
Did you change IIS permissions or settings?
"stevebuy@.eircom.net" wrote:
> I'm hoping someone else has had this problem and can help me.
> For some unknown reason all of a sudden the exporting of reports (to
> PDF, Excel) cause a "File cannot be found error". This started to
> happen last week after 12 months of error-free operation.
> I've tried deleting the reports and re-deploying them - no success.
> I would be most grateful someone could offer some insight.
> Many thanks in advance,
> SteveB.
>|||I've just cleared the temporary internet folder and the problem has
gone away on the local machine. There is still a problem with PDF
rendering subscriptions but I need to investigate it some more. Thanks
for taking the time to reply.
File Name to be changed
Hello,
I have developed a report based on a stored procedure which will generate an Excel file. There is a requirement that this file be sent to an external domain with the name changed to match their naming convention, i.e. a date suffixed to the renamed file.
I would appreciate, if anybody has done so and can give me some steps for me to work out.
TIA
Hello,
Take a look at data-driven subscriptions in Books Online. I believe this is the only way to update the filename without writing a custom app to handle it.
Hope this helps.
Jarret
|||Hello Jarret,
Thanks for your reply. I apologize if my information was not clear.
I would like to change the filename of the attachment sent in the email, not the filename of the report itself.
Thanks
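To make the data-driven subscription pointer concrete: the subscription's delivery query returns one row per delivery, and its columns are mapped to the delivery settings. A minimal sketch of computing a date-suffixed name, with the recipient address, base name, and yyyymmdd convention all assumed for illustration:

```sql
-- Hypothetical delivery query for a data-driven subscription. Each column
-- maps to a delivery setting in the subscription wizard; CONVERT(..., 112)
-- produces a yyyymmdd date to suffix onto the assumed base name.
SELECT
    'recipient@example.com'                            AS EmailTo,
    'MyReport_' + CONVERT(varchar(8), GETDATE(), 112)  AS [FileName]
```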
Sunday, February 19, 2012
File Generation through the DTS
I have created a DTS package (SQL 2000) that generates an Excel file.
The first time it works great, but every time after that the DTS would
append the data to the file and not overwrite it. I need it to be
overwritten. I tried to replace the generated file with an empty one,
but it doesn't put any data in it.
How can I have my DTS package overwrite the excel file as opposed to
append to it?
Thank you,
T.
I assume you are using the Export Data wizard to create the DTS
package. Start this process and choose an EXISTING spreadsheet. When
you get to the step where you select the table(s) to be exported, the
last column on the table line is headed Transform. Once you have
checked off the table to export, a button appears in that column;
click on that button. In the window that appears, on the Column
Mappings tab, there are options for Delete and Append. Choose Delete.
Be sure to Save the package.
Then you can either use the package it created, or inspect that
package to see how the Delete step was coded and copy that.
Roy Harvey
Beacon Falls, CT
On 8 Mar 2007 08:54:43 -0800, "tolcis" <nytollydba@.gmail.com> wrote:
>I have created a DTS package (SQL 2000) that generates an Excel file.
>The first time it works great but every time after that the DTS would
>append the data to a file and not overwrite it. I need to be
>overwritten. I tried to replace the generated file with the empty one
>but it doesn't put any date in it.
>How can I have my DTS package overwrite the excel file as opposed to
>append to it?
>Thank you,
>T.
|||On Mar 8, 12:18 pm, Roy Harvey <roy_har...@.snet.net> wrote:
> I assume you are using the Export Data wizard to create the DTS
> package. Start this process and choose an EXISTING spreadsheet. When
> you get to the step where you select the table(s) to be exported, the
> last column on the table line is headed Transform. Once you have
> checked off the table to export, a button appears in that column;
> click on that button. In the window that appears, on the Column
> Mappings tab, there are options for Delete and Append. Choose Delete.
> Be sure to Save the package.
> Then you can either use the package it created, or inspect that
> package to see how the Delete step was coded and copy that.
> Roy Harvey
> Beacon Falls, CT
> On 8 Mar 2007 08:54:43 -0800, "tolcis" <nytolly...@.gmail.com> wrote:
>
I am using a query to generate a resultset. In DTS I have a
connection object and destination spreadsheet. In the mappings I
don't see any delete or append.
|||On 8 Mar 2007 10:48:19 -0800, "tolcis" <nytollydba@.gmail.com> wrote:
>I am using a query to generate a resultset. In DTS I have a
>connection object and destination spreadsheet. In the mappings I
>don't see any delete or append.
My example applied to the windows in the Wizard. I still suggest
following my example to see how the wizard codes the DELETE so that
you can copy the method in your DTS package.
Roy Harvey
Beacon Falls, CT
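A workaround often used with DTS and Excel destinations (not mentioned in the thread, so treat it as an untested sketch) is an Execute SQL task on the Excel connection that drops and recreates the worksheet before the export, so each run starts from an empty sheet. The sheet and column names here are assumed:

```sql
-- Untested sketch, run against the Excel connection before the export step.
-- DROP TABLE clears the worksheet's data; CREATE TABLE rebuilds the header
-- row, so the export no longer appends below the previous run's rows.
DROP TABLE [Sheet1$]
CREATE TABLE [Sheet1$] (ID int, CustomerName nvarchar(50))
```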