I have a database in the SIMPLE recovery model with 10GB free.
The file I am trying to load through SSIS's Bulk Insert Task is 1GB.
The table is empty (it is truncated before each load), is a heap, and has no nonclustered indexes. All of its columns are varchar(255).
When the bulk insert task runs, I can watch the database size grow from 2GB to 12GB (the maximum space available), at which point it runs out of space and returns the following message:
[Bulk Insert Task] Error: An error occurred with the following error message: "Could not allocate space for object 'dbo.tablename' in database 'DBNAME' because the 'PRIMARY' filegroup is full. Create disk space by deleting unneeded files, dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup.".
First, I thought that because the database is in the SIMPLE recovery model, the bulk load would be minimally logged. I know it is fully logged when the table already contains data AND has a clustered index, but neither is the case here. I assume the Bulk Insert Task qualifies for minimal logging, since it should use BCP under the covers. Am I wrong?
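For reference, this is roughly the statement I would expect the task to issue under the covers, and my understanding of the minimal-logging prerequisites (the file path and terminators here are placeholders, not my actual values):

```sql
-- Sketch of the equivalent BULK INSERT; path/terminators are placeholders.
BULK INSERT dbo.tablename
FROM 'C:\loads\file.txt'
WITH (
    TABLOCK,             -- my understanding: required for minimal logging, even on a heap
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);
```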
Second, it seems very strange that a 1GB text file would balloon to over 10GB when stored in varchar fields. The data is not being transformed; it is simply bulk loaded into the table. Why is it growing so large?
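To try to pin down where the space is going, I have been checking the table and file sizes mid-load with queries like these (the table name is the one from the error message above):

```sql
-- Reserved vs. actual data space for the heap after a (partial) load.
EXEC sp_spaceused N'dbo.tablename';

-- Size of each data/log file, in MB, to see which file is growing.
SELECT name, type_desc, size * 8 / 1024 AS size_mb
FROM sys.database_files;
```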