(And before anyone points it out: I know they're not calling it DTS any more!)

My problem is as follows: whenever I import a flat file using the DTS import wizard, it sets the default column sizes to a ludicrously small 50 characters, and it doesn't subsequently matter HOW I define the table I'm pulling the data into; it chokes until I correct the column sizes by hand.

One (obvious) way round this would be to save the package and edit it from there. Unfortunately, given the nature of the files, that isn't all that practical, because this is going to be an issue across a good 10 or 15 different databases.

Is there a way of getting SQL Server to raise its default settings so that I can go back to controlling column size on the table only? Or am I SOL?

I'm pretty sure you're SOL. I do a lot of DTS-intensive work, and I have clients install SQL 2000 just so I can use DTS to import data destined for SQL 2005. I don't like what they did to DTS either.

Hi Athersgeo,

Believe it or not, this is by design in SSIS. The best way to handle flat-file imports is to map the flat file connection EXACTLY the way the file is laid out, and to configure the source to behave how you want when anything unexpected turns up (fail on truncation, redirect the row, and so on).

You can fix this issue by mapping the file connection correctly and then updating the metadata within your data flow. Once the connection carries the correct metadata, it propagates to the rest of the tasks that consume that data. Rather than clicking on each task in question and choosing to refresh it, most times I just delete and re-drag the data-flow paths (the little green and red arrows), which forces the metadata to update.
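If you'd rather sidestep the wizard's 50-character defaults entirely and keep column sizing on the table, as the original post asked, one alternative is a plain T-SQL BULK INSERT into a table you define yourself. This is a minimal sketch, not the SSIS route described above; the table name, column widths, file path, and delimiters are all hypothetical and would need to match your actual files:

```sql
-- Define the destination table yourself, with whatever widths you need;
-- BULK INSERT honours these instead of imposing wizard defaults.
CREATE TABLE dbo.ImportTarget (
    CustomerName varchar(500) NULL,
    Notes        varchar(2000) NULL
);

-- Load the flat file directly into the table.
-- Adjust FIELDTERMINATOR/ROWTERMINATOR to the file's actual format;
-- FIRSTROW = 2 skips a header row, if the file has one.
BULK INSERT dbo.ImportTarget
FROM 'C:\data\customers.txt'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2
);
```

Since the same script can be pointed at each database, this may scale better across your 10 to 15 databases than re-tweaking the wizard every time, though you lose SSIS features like per-row error redirection.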

Hope this helps!
