Monday, March 12, 2012
Filegroup is full error
This is a server post even though I mention Analysis Services!
I am getting an error as follows when I try to process a cube:
Data source provider error: Could not allocate space for object '(SYSTEM
table id: -76163232)' in database 'TEMPDB' because the 'DEFAULT' filegroup is
full.;42000; Time:16/11/2004 7:05:34 PM
It looks like a SQL Server problem rather than AS.
The TEMPDB data file and transaction log are set to grow automatically with unrestricted growth, and there is plenty of overall disk space (it's on a RAID system).
Any ideas as to what the problem might be? Does the number 42000 mean
anything? A bad stripe has been suggested (i.e. hardware). I am going to
try adding another file to the PRIMARY filegroup but beyond that I am a bit
lost.
Hope someone can help.
Les
Sometimes it cannot grow fast enough. Don't rely on autogrow if you know you need more space. Manually grow the file(s) in that filegroup and try again.
--
Andrew J. Kelly SQL MVP
"Les Russell" <LesRussell@.discussions.microsoft.com> wrote in message
news:8157C0BC-32E4-4C9C-85AF-50D8683A829B@.microsoft.com...
> Hello,
> This is a server post even though I mention Analysis Services!
> I am getting an error as follows when I try to process a cube:
> Data source provider error: Could not allocate space for object '(SYSTEM
> table id: -76163232)' in database 'TEMPDB' because the 'DEFAULT' filegroup
> is
> full.;42000; Time:16/11/2004 7:05:34 PM
> It looks like a SQL Server problem rather than AS.
> The TEMPDB data file and transaction log are set on automatically grow and
> unrestricted growth, and there is plenty of overall disk space (it's on a
> RAID system).
> Any ideas as to what the problem might be? Does the number 42000 mean
> anything? A bad stripe has been suggested (i.e. hardware). I am going to
> try adding another file to the PRIMARY filegroup but beyond that I am a
> bit
> lost.
> Hope someone can help.
> Les|||I had the same thing a while back if Im not mistaken. What it came out to be
was that tempdb couldnt grow fast enough to keep up with the demands. I
increased the size to 100 megs and havent had the problem since.
"Les Russell" <LesRussell@.discussions.microsoft.com> wrote in message
news:8157C0BC-32E4-4C9C-85AF-50D8683A829B@.microsoft.com...
> Hello,
> This is a server post even though I mention Analysis Services!
> I am getting an error as follows when I try to process a cube:
> Data source provider error: Could not allocate space for object '(SYSTEM
> table id: -76163232)' in database 'TEMPDB' because the 'DEFAULT' filegroup
is
> full.;42000; Time:16/11/2004 7:05:34 PM
> It looks like a SQL Server problem rather than AS.
> The TEMPDB data file and transaction log are set on automatically grow and
> unrestricted growth, and there is plenty of overall disk space (it's on a
> RAID system).
> Any ideas as to what the problem might be? Does the number 42000 mean
> anything? A bad stripe has been suggested (i.e. hardware). I am going to
> try adding another file to the PRIMARY filegroup but beyond that I am a
bit
> lost.
> Hope someone can help.
> Les|||Andrew,
Thanks to you and ChrisR. I think you are right.
I have redesigned the cube so that it is smaller, and it is OK now, but I
will check up how to grow the file manually for future reference.
Les
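For future reference, manually growing tempdb looks roughly like the sketch below. This is a hedged example: the logical file name tempdev is the SQL Server default, and the 500 MB sizes and C:\tempdb2.ndf path are made-up illustrations; check sp_helpfile for your actual names first.

```sql
USE tempdb
GO
-- Inspect the current logical file names and sizes first
EXEC sp_helpfile
GO
-- Pre-grow the primary data file so autogrow is not needed mid-processing
-- ('tempdev' is the default logical name; 500 MB is an example size)
ALTER DATABASE tempdb
MODIFY FILE (NAME = tempdev, SIZE = 500MB)
GO
-- Alternatively, add another file to the PRIMARY filegroup
-- (path and size are assumptions for illustration)
ALTER DATABASE tempdb
ADD FILE (NAME = tempdev2, FILENAME = 'C:\tempdb2.ndf', SIZE = 500MB)
GO
```

Pre-sizing this way avoids the mid-processing autogrow stalls Andrew describes.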
Wednesday, March 7, 2012
File System Task Problem
I'm trying to build a File System Task that renames files from a Foreach Loop container. That means the task has a variable in the source connection. This variable gets its value from the expression "c:\\.....\\" + @[User::ForeachloopVar].
But an error message appears when I run it. The message is:
File System Task: An error occurred with the following error message:
"The given path's format is not supported."
When I don't use the variable in the source connection, it works fine.
Anyone know what the problem might be?
Thanks.
Escape the slashes to be "\\"
BTW - you may find this blog post from Kirk Haselden useful ... http://sqljunkies.com/WebLog/knight_reign/archive/2005/02/12/7750.aspx
Donald
|||I'm sorry, but that's not the problem, because I only wrote the example without the doubled slashes; they're there. And I've already looked at that blog. It was helpful for creating my package. But that's the thing: in his case the value of the source variable from the foreach loop appears in the value of the expression; mine doesn't.|||I see that you are creating a path and appending the source variable in an expression. However, the ForEach Loop includes the option to return the fully qualified name of the file. Is there a reason you're not using that?
Donald
|||I'm using the fully qualified name option; that's not the problem.|||It's working now. I made some changes, like using only the Name and extension option on the foreach, and voilà.
thanks.
|||That link was quite a nice example, but I seem to be having a bit of a novice problem. I've followed the blog virtually to the letter, even using the same variable names, but on the File System Task I get an error saying that 'FileSourcePath is used as a source or destination and is empty'.
I created the ForEach loop container as was described:
On the collection I have set the path (browsed to the path, so it does exist). I've specified files as '*.txt' - there will be 2 files in the directory by the time the container executes. In the variable mappings I have specified the user::FileSourcePath variable - it has an index value of 0. The variables have a scope of the entire package. I feel I've missed something obvious here, but somehow my container does not update the values for that variable - any help?
|||Look, I was having that same problem before. I did it just like the blog shows and it always failed; I don't know why, because it seems to work in the blog. I don't know if my solution was the best one, but I created another variable to contain the flat file source path, and in the foreach loop, on the Collection pane, under Retrieve file name, I chose the Name and extension option. That means I'm only getting the *.txt part into the variable @[User::FileSourcePath]. Then I put the new variable with @[User::FileSourcePath] in the Connection Manager ConnectionString. Then I just wrote the following in the Evaluate as expression of User::FileDestinationPath:
@[User::FileDestinationFolder] + "\\" + (DT_WSTR, 10)(DT_DBDATE)GETDATE() + @[User::FileDestinationPath]
and it renames the file just like I wanted and puts the date before the old name.
Look, I'm new at this too, so I don't know if I've explained it well, but I hope you can use something.
|||If you receive an error when the package starts saying that a source or destination placeholder variable is empty you can typically fix it this way: simply ensure that the default (initial) value of the variable is set to a filename.
Setting DelayValidation on the loop may also help in your case.
Donald
File System Task - Moving a File
I have a File System Task that uses variables to resolve the destination and source paths of a document. When I select the 'copyfile' operation...the document is copied from the source to the destination without error.
However when I change the property from 'copyfile' to 'movefile' I get an error and the document is not moved.
The source and destination variables contain a valid file path name, since the copy command works as expected. However, when I alter the properties of the File System Task to move the document, I get the following error:
Could not find a part of the path 'G:\Common\Information Systems\DropFiles\nrt\NRT_Confirmation\Order Confirmation Report_11062006.xls\Order Confirmation Report_11062006.xls'
It seems a little nonsensical, since the document file paths are valid when performing a copy. For some reason the error log shows the document name "Order Confirmation Report_11062006.xls" appended twice to the directory path "G:\Common\Information Systems\DropFiles\nrt\NRT_Confirmation\", as you can see in the above error message.
To replicate the 'move' action...I added an extra File System Task that deletes the document once the copy has been performed. I would like some insight into why this doesn't seem to work.
Thank you.
...cordell...
Do not specify the file name for the target. In the Usage Type of the File Connection properties, specify it as Existing Folder instead of File, and just select the folder path, since you are moving an existing file.
File system error: A FileStore error from WriteFile occurred
Error: Code: 0x00000000
Source: SRC Process dimensions
Description: File system error: A FileStore error from WriteFile occurred. Physical file: \\?\<path to filename> attribute.asstore. Logical file: . .
Errors in the OLAP storage engine: An error occurred while the '<attribute id>' attribute of the '<dimensionname>' dimension from the '<database>' database was being processed.
Errors in the OLAP storage engine: The process operation ended because the number of errors encountered during processing reached the defined limit of allowable errors for the operation.
End Error
I encountered this while incrementally processing a dimension via an AMO script in SSIS. Does anyone have ideas on what this error means and how to avoid it?
I unprocessed and reprocessed to resolve this.|||What do you mean by "unprocess a cube"?
How did you unprocess a cube?
Thanks.
Chris
|||Description: File system error: A FileStore error from WriteFile occurred. Physical file: \\?\<path to filename> attribute.asstore. Logical file: . .
Yikes. This has happened to me a few times within the cubes and dimensions. I also get the WriteFile error, which appears to be a bad reference to a file that no longer exists. I tried to use other processing methods to process the dimension that was broken, and now I don't even get the error anymore. Now the processing dialog box just hangs, saying that it is building a processing schedule. I do not want to unprocess the dimension, because I would have to reprocess the entire database. Does anyone have a clue as to how to avoid this, or why it happens? How could one fix it without unprocessing?
Thanks,
Sally
|||That was a while ago. My problem was actually caused by the size of the dimension I was processing.
It was just too big. Later I reduced the size, and the error disappeared.
Thanks anyway.
Chris
|||I have been experimenting with large dimensions (or at least what we consider to be large: ~100 million members), and have learned that the string store (filename ends in .asstore) cannot exceed 4 GB. Is this the same thing you have run into, or have you found another limitation?
John
Sunday, February 26, 2012
File source error : so much rows
I have a problem with my SSIS package. I have a data flow with a flat file source in CSV, but it has 140,000 rows, so when I execute the data flow, I get an error saying that the data exceeds the I/O buffer (sorry for the translation; I have the message in French).
I tried setting DefaultBufferMaxRows to 140000, but I still have the problem.
If you can help me, thank you.
You might try setting the BufferTempStoragePath property on the data flow to point to a drive with plenty of free space. If you search the forum for that property, you'll find more information.|||OK, I'll go search for it.
Thank you for your answer.
Friday, February 24, 2012
file name or extension is too long
Hi All,
I am receiving the following message when I run a DTS package which is creating a cube.
Error Source: Microsoft Data Transformation Services (DTS) Package
Error Description: File name or extension is too long
Any help will be appreciated!
Thanks in advance,
Mohammed Sarwar
Ocp dba oracle 9i,8i,8.0
*** Sent via Developersdex http://www.developersdex.com ***
Don't just participate in USENET...get rewarded for it!
You didn't mention which version of SQL Server you have, but if it's
7, then this KB article may apply to you:
http://support.microsoft.com/defaul...kb;en-us;243545
If the article isn't helpful, then you should post some more
information - your version of SQL Server, which step the package is
failing on etc. You might also want to post to
microsoft.public.sqlserver.dts.
Simon
File Manipulation from MS SQL Server
Thanks in advance!
Daniel
Austin, Texas
Try looking into the xp_cmdshell stored proc in BOL. This will allow you to use DOS commands from a T-SQL script. We had to use this stored proc to rename delimited files we created from a DTS package to be exported to a marketing company. Works pretty well.
HTH
DMW|||Hi DMW,
Thank you very much! I will give this a try.
Sincerely,
Daniel
Austin, Texas|||USE Northwind
GO
SET NOCOUNT ON
-- Scratch table to capture the command output, one row per line
CREATE TABLE myTable99(RowNum int IDENTITY(1,1), Data varchar(8000))
GO
-- Capture the output of a DOS command into the table
INSERT INTO myTable99(Data) EXEC master..xp_cmdshell 'Dir C:\*.*'
SELECT * FROM myTable99
GO
SET NOCOUNT OFF
DROP TABLE myTable99
GO|||I use the sp_OA procedures with FileSystemObject, because xp_cmdshell requires the user to have privileges on the target file system.
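The sp_OA approach mentioned above can be sketched roughly as follows. This is a hedged example, not a tested recipe: the source and destination paths are made up for illustration, and sp_OACreate requires sysadmin membership (or explicit permission on the OLE Automation procedures).

```sql
DECLARE @fso int, @hr int
-- Instantiate the Scripting.FileSystemObject COM object
EXEC @hr = sp_OACreate 'Scripting.FileSystemObject', @fso OUT
IF @hr <> 0 RAISERROR('Could not create FileSystemObject', 16, 1)
-- Move (rename) a file; both paths are hypothetical examples
EXEC @hr = sp_OAMethod @fso, 'MoveFile', NULL,
    'C:\export\old.txt', 'C:\export\new.txt'
IF @hr <> 0 RAISERROR('MoveFile failed', 16, 1)
-- Always release the COM object when done
EXEC sp_OADestroy @fso
```

Because the COM call runs in-process under the SQL Server service account, it sidesteps the per-user file system permission issue that xp_cmdshell raises.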
Sunday, February 19, 2012
file import to SQL Server 2000
I have a file with an extension of .sdf. I "believe" it is a text file
of some sort but I am uncertain. The source agency hasn't returned any
of my calls so I'm wondering if anyone is familiar with this extension?
I'd like to import the file into my database. When I use DTS and choose a text format, regardless of what delimiter I choose, the format is still really ugly. When I pull it up in a huge text editor, it is hard for me to tell what is there.
I saw in one of my searches that it could be a comma delimited (it's
not) .. could be a unisys file? I know it's not much information to go
on - but where should I start in trying to get this into my database
without knowing the format? Any suggestions would be greatly
appreciated.
Thanks!
Bethany
*** Sent via Developersdex http://www.developersdex.com ***
Don't just participate in USENET...get rewarded for it!
A google search turned up:
SDF = Windows SQL CE-version internal databases
SDF = MDL Isis SDF chemical modeller input file
SDF = Mime: application/e-score
SDF = Mime: chemical/x-mdl-sdf
SDF = Source Definition File (Sourcer)
SDF = System Data Format file (fixed-length ASCII text)
> know it's not much information to go on -
> but where should I start in trying to get this into my database
> without knowing the format?
All you can do is make assumptions and try to identify the file format via
trial-and-error. You might get help with identifying/eliminating SQL CE
from the microsoft.public.sqlserver.ce.
Once you've positively identified the file format, you can take steps to
either import directly or at least transform data into an intermediate
format suitable for import.
--
Hope this helps.
Dan Guzman
SQL Server MVP
"Bethany Holliday" <bhollida@.iupui.edu> wrote in message
news:418a7210$0$14507$c397aba@.news.newsgroups.ws.. .
> Hi all,
> I have a file with an extension of .sdf. I "believe" it is a text file
> of some sort but I am uncertain. The source agency hasn't returned any
> of my calls so I'm wondering if anyone is familiar with this extension?
> I'd like to import the file into my database - when I use DTS and chose
> a text format, regardless of what delimiter I choose, the format is
> still really ugly. when I pull it up in a huge text editor, it is hard
> for me to tell what it is there.
> I saw in one of my searches that it could be a comma delimited (it's
> not) .. could be a unisys file? I know it's not much information to go
> on - but where should I start in trying to get this into my database
> without knowing the format? Any suggestions would be greatly
> appreciated.
> Thanks!
> Bethany
>
> *** Sent via Developersdex http://www.developersdex.com ***
> Don't just participate in USENET...get rewarded for it!
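If the .sdf turns out to be a fixed-length ASCII file, one way to get it staged for inspection is to load each line whole with BULK INSERT and then slice columns with SUBSTRING once the layout is known. The table name, file path, and column widths below are made-up examples, not the actual layout:

```sql
-- Staging table: one varchar column holds each raw line
CREATE TABLE sdf_staging (raw_line varchar(8000))
GO
-- Load the whole file, one row per line (path is an example)
BULK INSERT sdf_staging
FROM 'C:\data\input.sdf'
WITH (ROWTERMINATOR = '\n')
GO
-- Example parse, assuming (hypothetically) a 10-char code
-- followed by a 30-char name in each record
SELECT SUBSTRING(raw_line, 1, 10)  AS code,
       SUBSTRING(raw_line, 11, 30) AS name
FROM sdf_staging
```

Eyeballing the staged rows this way also helps confirm whether the file really is fixed-width before building a DTS transformation around it.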