We (my colleagues at Miracle and I) are currently upgrading a customer’s solutions. One of the tasks is migrating 100 gigs of data into an existing database schema. This can easily become tedious and take considerable calendar time. Therefore we decided to disable all constraints on the database, move the data and then enable them again, as we know the data integrity is sound.
We could have written a script that gets all the constraints in the database and generates a SQL script, but there is an easier way…
-- Disable all Constraints
exec sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL'
--Enable all Constraints
exec sp_MSforeachtable 'ALTER TABLE ? CHECK CONSTRAINT ALL'
That’s easy… 😛
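One caveat: re-enabling with CHECK CONSTRAINT ALL does not re-validate the existing rows, so SQL Server marks the constraints as not trusted. That is fine in our case, since we know the data is sound, but if you want the constraints trusted again (which the query optimizer can take advantage of), re-enable them with WITH CHECK instead – a sketch:
-- Enable and re-validate all Constraints (marks them as trusted again)
exec sp_MSforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL'
Be aware that the re-validation reads through the data, so on 100 gigs it can take a while.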
I found this on the Internet somewhere – sp_MSforeachtable is an undocumented stored procedure and is nowhere to be found in Books Online.
You can check whether foreign keys and check constraints are disabled via the sys.foreign_keys and sys.check_constraints catalog views.
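For example, a quick sanity check after enabling everything again (the is_disabled column shows the state, and is_not_trusted reveals whether the data has been re-validated):
-- Any constraints still disabled?
select name, is_disabled, is_not_trusted from sys.foreign_keys where is_disabled = 1
select name, is_disabled, is_not_trusted from sys.check_constraints where is_disabled = 1
Both queries should return no rows once all constraints are enabled again.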
Another obvious triviality in our situation is disabling and enabling triggers – that is easy too:
-- Disable all Triggers
exec sp_MSforeachtable 'ALTER TABLE ? DISABLE TRIGGER ALL'
-- Enable all Triggers
exec sp_MSforeachtable 'ALTER TABLE ? ENABLE TRIGGER ALL'
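The triggers can be verified the same way – the sys.triggers catalog view also has an is_disabled column:
-- Any triggers still disabled?
select name, is_disabled from sys.triggers where is_disabled = 1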
I just got word from Microsoft, that I passed the Microsoft Certified Technology Specialist: .NET Framework 3.5, Windows Communication Foundation Applications certification (70-503).
It was not easy, as the preparation guide was incomplete and I didn’t have time to properly prepare for the test. I took the beta exam in January, but had totally forgotten about it.
I am proud to be one of the first to pass this exam 🙂
On a 32 bit platform a process can by default only address 2 GB of memory on Windows. For most applications this is not an issue. But today I really messed up – OOM (OutOfMemoryException) exceptions all over 🙁
I added loads of test data to our 32 bit development servers and tried to run a development utility. It failed and failed and failed with OOM exceptions. The cause was actually quite simple: the application tried to allocate more than 500 MB of contiguous memory.
The utility is a console app and the search index is a WCF server; they communicate over TCP using buffered binary messages (netTcpBinding).
The console utility retrieved all the unique identifiers in the search index (based on Lucene.Net) to compare them with the records in the SQL Server, to see if everything is in sync. The number of unique identifiers was now in the millions, but that was not the root cause. The process consumed twice the memory, because I used buffered transfer mode instead of streamed.
Buffered transfer mode serializes the entire message before sending it from the server to the client. So in effect our WCF server held not only the huge array of unique identifiers but also a serialized copy of it. It never got to the stage where it could send the message.
Moral of the story – use streamed transfer mode with large messages.
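For reference, switching to streamed transfer is mostly a binding configuration change (a sketch – the binding name is made up, and maxReceivedMessageSize is raised because the 64 KB default is far too small for messages like ours):
<netTcpBinding>
  <binding name="streamedTcp" transferMode="Streamed" maxReceivedMessageSize="2147483647" />
</netTcpBinding>
Note that streaming puts restrictions on the service contract – the operation must essentially expose the payload as a Stream in the message body.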
As this is a development utility, I sidestepped the problem 🙂
I enabled the /3GB switch, allowing processes to address up to 3 GB of memory, and tried again. Same result: OOM.
I figured that .NET might have a limitation and perhaps a switch to enable larger memory allocation. It turns out there is a small program called the Microsoft COFF Binary File Editor (EDITBIN.EXE) that modifies Common Object File Format (COFF) binary files. With its LARGEADDRESSAWARE option it can mark an application as able to address more than 2 GB of memory.
The steps to enable LARGEADDRESSAWARE for an application:
- Start a Visual Studio Command Prompt or run VSVARS32.BAT (in the Visual Studio subdirectory Common7\Tools), which in effect does the same thing.
- Run editbin /LARGEADDRESSAWARE <yourApp>.exe
Tomorrow I will change the implementation to use streamed transfer mode. Promise 🙂
The answer is not 42 🙂 , but 11,417. I will never be able to grasp the vast capabilities of the .NET Framework.
With more than 100,000 members it’s no wonder that I am addicted to IntelliSense.
You can get more details in Brad Abrams’ blog entry.
All the presentations from the product launch were recorded as webcasts. They are available here (in Danish). Later this year the webcasts will be moved to TechNet.
I spoke about long overdue features in SQL Server for developers.
Some of the webcasts have poor sound quality (not mine 😉 ), which is a shame.
The webcasts were recorded at Microsoft. They used Camtasia Studio, which I highly recommend as it is easy, intuitive and has some great features.
I guess the poor sound quality is due to a lack of proper microphones and the fact that the recording was done in a regular meeting room. They used the built-in microphone on my laptop. Microsoft Denmark should get better sound equipment and a recording studio if they are serious about webcasts.