SQL Help

#1 Nov 02 2010 at 12:49 PM Rating: Good
**
568 posts
I know this is more of a hardware forum, but I was hoping to get some SQL help. I figure some of the regulars here would be familiar with SQL. It's a bizarre problem that I wouldn't even know how to properly google.

I have a temp table that I'm inserting rows into. If I perform the following script:

insert into #TempTable
select * from TableA

...I get a "String or binary data would be truncated" error. Ok, that's fine. I guess somewhere in the data there's probably a string value that's too large. So I checked the string fields, and I'm pretty sure none of them are too large; I verified this with some length queries on the string columns.
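For reference, this is roughly the kind of length check I mean. The column name here is hypothetical; compare the results against the declared sizes in both table definitions. One subtlety: LEN ignores trailing spaces, so DATALENGTH is the safer check for varchar columns.

```sql
-- SomeVarcharCol is a placeholder; run this per string column.
-- MaxLen counts characters (minus trailing spaces); MaxBytes counts storage.
SELECT
    MAX(LEN(SomeVarcharCol))        AS MaxLen,
    MAX(DATALENGTH(SomeVarcharCol)) AS MaxBytes
FROM TableA;
```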

So just to make sure there isn't something I missed, I attempt to insert just the top 100 records. If that works, the top 200. I figure I'll narrow it down to the record with the bad data in it. And that's when I come upon something totally baffling.
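A sketch of that narrowing step, for anyone trying the same thing. One caveat worth knowing: TOP without an ORDER BY returns rows in no guaranteed order, so successive runs may pick different rows; adding an ORDER BY on a unique column (the Id column here is hypothetical) makes each run deterministic.

```sql
-- Bisect toward the offending row; adjust the TOP (n) value each run.
INSERT INTO #TempTable
SELECT TOP (100) *
FROM TableA
ORDER BY Id;  -- any unique column, so the "top" rows are stable between runs
```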

There are 409 records in TableA. If I do this script, it works:

insert into #TempTable
select top 500 * from TableA

Yet, if I do this script, it fails:

insert into #TempTable
select * from TableA

wtf? Anyone seen this before?

After some more testing, I find that the script fails when I select the top 507 rows, but it works if I select the top 506 rows. And yet, the actual row count is still only 409! So I don't even know where to begin with this.

Edit: I guess I should mention I'm working in MS SQL 2008

Edited, Nov 2nd 2010 11:59am by enigmaticcam
#2 Nov 02 2010 at 2:02 PM Rating: Good
**
801 posts
Probably just another glitch in SQL. I remember once having an issue importing from certain formats where it could only do so many records in one shot, and we had to split them into multiple files. It was a real pain too; to make it easier to split by the right number of records, we'd dump to Excel, split them up, then pull them back in. I forget what the limit was (it's been almost 5 years now), but it could be the same glitch.

Raist
#3 Nov 03 2010 at 8:51 AM Rating: Good
**
568 posts
Probably a glitch of some sort. I figured out the problem: the table definitions didn't match. Some of the string fields in the temp table were declared smaller than in the regular table, and this seemed to cause the problem even though none of the data was too large for either table. I still don't know why it succeeded some of the time, but I ultimately fixed it by making sure everything matched exactly.
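One way to guarantee the definitions match (a sketch, not necessarily how you did it): let SELECT ... INTO derive the temp table's columns from TableA itself, so the types and sizes can't drift.

```sql
-- WHERE 1 = 0 creates #TempTable with TableA's column definitions
-- but copies no rows; the insert then can't hit a size mismatch.
SELECT * INTO #TempTable FROM TableA WHERE 1 = 0;

INSERT INTO #TempTable
SELECT * FROM TableA;
```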
#4 Nov 03 2010 at 6:12 PM Rating: Good
**
801 posts
Yeah... in spite of all the tweaking and streamlining they've done with it, MS SQL can be a little unforgiving sometimes. Even the DTS wizard sometimes misses some of the oddities when you map everything out. And the lovely generic message that the datapump failed doesn't help much either. Glad you were able to work it out.

Raist