Scott Hanselman

Getting LINQ to SQL and LINQ to Entities to use NOLOCK

March 18, '08 Comments [23] Posted in LINQ

I was visiting a local company where a bunch of friends work and they asked how to get LINQ to SQL to use NOLOCK. They were explicitly asking how to get the SQL generated by LINQ to SQL to add the NOLOCK hints to the end.

However, NOLOCK (even though "everyone" has used it at some point or another) is generally considered a last resort. Queries that use NOLOCK aren't guaranteed to return correct results or, technically, to return any results at all.

SQL 2005 has snapshot-based isolation levels that prevent readers from blocking writers or writers from blocking readers without allowing dirty reads.

Now, I have said that NOLOCK has served me personally very well in the past on systems of some size, but I hear what folks who say no to NOLOCK are saying. It certainly depends on one's definition of "correct results." ;)

There are three ways to get the behavior you want. Using TransactionScope is the recommended way to affect the transaction options associated with the commands generated by either LINQ to SQL or LINQ to Entities.

LINQ to SQL also supports explicitly setting the transaction on the context, so you could get the connection from the context, open it, start a transaction, and set it on the context. This can be desirable if you think SQL 2005 is promoting transactions too often, but the preferred method is TransactionScope.
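A minimal sketch of that explicit-transaction approach follows. NorthwindDataContext is a hypothetical designer-generated context (not from this post), and the code assumes the usual System.Data.Linq references:

```csharp
// Sketch only: explicitly owning the transaction on a LINQ to SQL context.
// "NorthwindDataContext" is a hypothetical designer-generated DataContext.
using (var northwind = new NorthwindDataContext())
{
    northwind.Connection.Open();
    using (var tx = northwind.Connection.BeginTransaction(
        System.Data.IsolationLevel.ReadUncommitted))
    {
        northwind.Transaction = tx; // queries on this context now run in tx
        var suppliers = northwind.Suppliers.ToList();
        tx.Commit();
    }
}
```

Because the connection and transaction are local, nothing gets promoted to a distributed transaction - but TransactionScope remains the cleaner option.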

ProductsNewViewData viewData = new ProductsNewViewData();
using (var t = new TransactionScope(TransactionScopeOption.Required,
    new TransactionOptions
    {
        IsolationLevel = System.Transactions.IsolationLevel.ReadUncommitted
    }))
{
    viewData.Suppliers = northwind.Suppliers.ToList();
    viewData.Categories = northwind.Categories.ToList();
}

Here's an example where I used it in some recent code. This TransactionScope could be hidden (layered away) in your DAL (Data Access Layer) or in your Data Context directly if you wanted it to be neater.

A second way is that you can still create and call Stored Procedures (sprocs) from LINQ to SQL, and those sprocs could include NOLOCK. Still, TransactionScope is a better choice for LINQ to SQL or LINQ to Entities generated SQL if you feel that your query doesn't need to lock down the table(s) it's reading from.
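For illustration, here's roughly what a sproc mapping on a LINQ to SQL context looks like - the sproc name and entity are hypothetical, and the designer normally generates equivalent code for you (assumes usings for System.Data.Linq, System.Data.Linq.Mapping, and System.Reflection):

```csharp
// Sketch: a context method mapped to a stored procedure that could
// use NOLOCK internally. "dbo.GetSuppliersNoLock" and "Supplier"
// are hypothetical names, not from this post.
public partial class NorthwindDataContext
{
    [Function(Name = "dbo.GetSuppliersNoLock")]
    public ISingleResult<Supplier> GetSuppliersNoLock()
    {
        IExecuteResult result = this.ExecuteMethodCall(this,
            (MethodInfo)MethodInfo.GetCurrentMethod());
        return (ISingleResult<Supplier>)result.ReturnValue;
    }
}
```

This keeps the locking decision in the database, which some shops prefer for exactly the separation-of-concerns reasons raised in the comments below.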

Note that you'll want to be aware of which statement in your LINQ to SQL code actually starts talking to the database. You can set up a query ahead of time and it won't be executed until, for example, someone calls ToList() or the like. It's at that point that you'll want to wrap it in the using(TransactionScope){}.
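In other words (a sketch, using the same hypothetical northwind context as the example above):

```csharp
// Defining the query doesn't touch the database at all...
var query = from s in northwind.Suppliers
            where s.Country == "USA"
            select s;

// ...execution is deferred until ToList(), so the scope must wrap *this*:
using (var t = new TransactionScope(TransactionScopeOption.Required,
    new TransactionOptions
    {
        IsolationLevel = System.Transactions.IsolationLevel.ReadUncommitted
    }))
{
    var suppliers = query.ToList(); // the SQL actually executes here
    t.Complete();
}
```

If the ToList() happened outside the scope, the ReadUncommitted option would silently never apply to that query.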

A third way is to set it at the DataContext level (which, to be clear, would affect every generated LINQ to SQL query executed on that context) by executing the command:

SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED

...using the connection available inside your DataContext.
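A sketch of that, again with a hypothetical NorthwindDataContext - keeping in mind it changes the isolation level for everything subsequently run on that open connection:

```csharp
// Sketch: set the isolation level once for the context's connection.
using (var northwind = new NorthwindDataContext())
{
    // Open the connection ourselves so it (and the SET) persists for the
    // lifetime of the context rather than per-command.
    northwind.Connection.Open();
    northwind.ExecuteCommand(
        "SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED");

    var categories = northwind.Categories.ToList(); // reads uncommitted
}
```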

Thanks to David Browne and Michael Pizzo for their help on this!

About Scott

Scott Hanselman is a former professor, former Chief Architect in finance, now speaker, consultant, father, diabetic, and Microsoft employee. He is a failed stand-up comic, a cornrower, and a book author.

Tuesday, March 18, 2008 1:00:32 AM UTC
A company I used to work for required all developers to always put the (NOLOCK) hint after every table definition, for every SELECT query-- period. If you didn't, you were in deep doo-doo. Fun, eh?
Tuesday, March 18, 2008 1:33:29 AM UTC
Why is everyone all of sudden using var everywhere? I've seen this in half-a-dozen articles/samples and it just looks so dirty when not using anonymous types for a reason. It's an overly used cliche (what's the alt+num code for the accented e?), but when I see var for no good reason, it's like a car crash, I just can't look away. Totally ruined the whole sample for me.

...excuse me while I go write some xQuery to make myself feel clean again.
Tuesday, March 18, 2008 2:40:22 AM UTC
Ryan,

Are you having a problem with the inferred type? I don't get it..
Tuesday, March 18, 2008 4:02:49 AM UTC
What about UPDLOCK, the single most useful locking hint for large scale transactions that involve reading data that you may want to update?
Garry Stewart
Tuesday, March 18, 2008 4:17:26 AM UTC
Ryan, I can't see why you have a problem with the inferred type. I may have to blog on that. It's just saving typing, it's inferred by the compiler. Why should I have to:

int[] foo = new int[] {1,2,3};

when I can

var foo = new int[] {1,2,3};

Why type int[] twice?
Tuesday, March 18, 2008 4:17:45 AM UTC
Garry - I'm looking into UPDLOCK.
Tuesday, March 18, 2008 12:04:15 PM UTC
i lv svng tpng, alwys a gd pln, rlly hlps rdblty! (srcsm)

I understand that the type information is there in the code, but I'm uncomfortable with casual use of var too. It does increase the impedance experienced when reading through code, because when I hit the declaration I've got to read further into the line before understanding what type I'm working with. It also means it won't be possible to quickly scan down the left hand side of the code to see where types are used in declarations. I know various search functions are only a few key-strokes away, but sometimes you just want to scroll up and down a short distance to check stuff out.

I don't want to see this becoming common practice - do you hear me, world? ;-)
Tuesday, March 18, 2008 12:30:03 PM UTC
If the "var" thing is all about typing then why include any of that at all? Why not use:


using(new TransactionScope(...))
{
...
}


If you're not actually going to do anything with the instance anyway!!!
Shaun Austin
Tuesday, March 18, 2008 1:44:20 PM UTC
On the ['var' versus explicitly stating your types] issue:

Steve McConnell once said that optimizing for write speed at the expense of readability is fundamentally wrong; you have to keep in mind that your code will be read an order of magnitude more times than it will be written, so it matters more to be clear and easy to read/parse than to be neat and tricky.

That said, I agree with Ryan that, except for anonymous types, you should write the actual type on the LHS instead of just 'var'. Makes for a much nicer "top-down, left-right" code reading experience.

But let's not get all flamewarish on the issue, okay ;-)
Claudio A. Heckler
Tuesday, March 18, 2008 3:12:37 PM UTC
I dunno. It just hasn't been a big issue with me. If the variable is named appropriately, you can infer what type it is most of the time anyhow. And if you're wondering, there's always the ability to hover over it and see just what it's being inferred as. Your mileage may vary.
Tuesday, March 18, 2008 3:45:26 PM UTC
Inferred type I think is great for small snippets and where the type is simple like your foo example...but where it gets hard to read like:

public string MyRoutine()
{
return "hello";
}

public string MyOtherRoutine()
{
var thingy = MyRoutine();
return thingy;
}

That's where I tend to dislike it. If the scope is small it's cool...but for long term readability it's rough...especially, like in my example, if MyRoutine was in another class. Intellisense is great (you can hover over both "var" and "MyRoutine()" and get the types) but it doesn't help if the code is printed in a book or on a webpage.
Ryan Smith
Tuesday, March 18, 2008 3:56:56 PM UTC
I agree with RyanS. I like using var when I'm writing type info twice on a single line. I also think Shaun's got an interesting idea. Why have a throw-away variable when you just don't need one at all?
Tuesday, March 18, 2008 7:29:01 PM UTC
There are quite a few other hints that you might want to be able to stick into the generated SQL for performance reasons -- "index=", "keep plan", "force order", "loop join", etc.
Curt Hagenlocher
Tuesday, March 18, 2008 7:51:24 PM UTC
NOLOCK is obsolete and is only there for backwards compatibility. Scott's setting the transaction level is the correct way to do things. We sped up a lot of stuff here where I work by not freezing big tables that our webpages were just doing SELECTs on.
John A. Davis
Tuesday, March 18, 2008 8:51:05 PM UTC
Ilya Ryzhenkov had an interesting post regarding "var" suggesting that it can actually _improve_ the readability of your code, assuming you are a responsible developer. RyanS's example of using "thingy" as a variable name is a bit of a strawman, because that would be the action of an irresponsible developer (unless, of course, thingies are important in your domain).

http://resharper.blogspot.com/2008/03/varification-using-implicitly-typed.html
Tuesday, March 18, 2008 9:21:29 PM UTC
I very rarely comment on blogs, but this time I have an itch that I just gotta scratch....

the question/concern I have is in regard to tight coupling and the lack of "separation of concerns" that this seems to introduce.

When setting the transaction isolation level in the application binaries, aren't we creating a very tightly coupled scenario in which the code has too much intimate knowledge of the DB?

How is a production DBA going to troubleshoot database performance, blocking, and deadlock issues? All we'll see in the profiler is a bunch of inline SQL statements with no indication of where they live or where they are coming from.

If the isolation level has to be changed, it would require a C# code change, a full QA cycle and a binary redeploy.

Seems to me that placing this in sprocs shields the app from having to know too much about the database.

Furthermore, this makes me think of some comments Bob Beauchemin had on his blog regarding performance tuning SQL generated by LINQ. He's right... if we can't control the SQL that is generated by the API, we will likely have huge performance issues that cannot easily be fixed without a huge redesign at the application level.

Currently, as all SQL code lives in sprocs, when I see a performance issue I can turn the profiler on and very quickly determine the sproc that is the troublemaker, and then I can quickly and easily modify the SQL as needed to optimize it. Knowing all the while that there are many ways to write the same query while yielding the same results but very different performance. Sometimes I just need to change a join to use a correlated sub-query instead. Sometimes I need to tweak the method used for filtering my WHERE clause.....

while making changes to the sproc I can instantly see the performance gains by viewing the query plan and monitoring sub-tree cost, etc. If LINQ is generating SQL for me, this becomes impossible.

Am I worried about nothing here or are these issues really going to exist with the wide adoption of LINQ?




GAJ
pdxJaxon
Tuesday, March 18, 2008 9:38:27 PM UTC
Yeah 'thingy' might be a little obtuse...I think another thing for me is that var was introduced for the purpose of anonymous types and their relationship with LINQ...it's pretty unlikely to me that an 'alternate' usage of some language construct had another benefit...I'm sure there are examples to prove me wrong in a most glorious manner, but this still has an element of code stink to me.
Ryan Smith
Tuesday, March 18, 2008 9:42:47 PM UTC
@pdxJaxon

You are very right; as described, LINQ to SQL does cross that line of coupling...for some development houses that might be fine, for others it's a violation of a cardinal rule...thankfully LINQ to SQL can also employ sprocs...so the best of both worlds is really available to you. So you can keep that isolated if you so wish.
Ryan Smith
Tuesday, March 18, 2008 10:19:31 PM UTC
oh. LINQ can call sprocs?

I was not aware of that.

Guess I need to study up on LINQ a bit more before panicking myself into an early grave.


thanks

GAJ
pdxJaxon
Tuesday, March 18, 2008 11:24:13 PM UTC
I tend to view code using optimizer hints as a pretty strong sign that someone's doing something wrong. One place I worked started dropping them into all their queries because they were always getting blocked. Turns out, they were using the Data Access Application Block, and always passing in a connection string instead of a SqlConnection. With the DAAB, that meant that every single query was running on a new connection, even though each query was really a subordinate piece of a larger operation (that _really_ should have been executed against a SqlTransaction). It also meant they were artificially inflating the number of open connections, and bumping up against connection limits.

If you're using optimizer hints, you're probably fixing the _result_ of your problem, rather than the _cause_ of your problem.

Dewayne Christensen
Tuesday, March 18, 2008 11:54:29 PM UTC
Maybe it's that I come from a VB.NET background where you can do this:

Dim o as New MyType()

But I really like var. It doesn't just save on typing the type twice on the same line, it keeps you from having to read the type twice. To me, it reduces noise in my code. So I'm using it in my code and there's nothing you can do to stop me. Muhaha!

Maybe one of you guys needs to build a refactor tool that changes all the inferred types to declared types.
Tuesday, March 18, 2008 11:56:15 PM UTC
DeWayne - as long as the connection strings were all the same and connection pooling wasn't explicitly disabled, every single query was probably not running on a separate database connection. The .NET SQL provider provides connection pooling by default. There is not necessarily a 1-1 mapping of SqlConnection object instances to actual database connections.
But I agree with your overall point ;)
Wednesday, March 19, 2008 12:30:21 AM UTC
I've asked and asked in so many forums, e-mails, etc. but have not got any satisfactory answer. So I thought I might try it again. So here goes.

If I am using TableAdapters for all my DB needs. What do I really gain in LINQ? In fact, am I not peppering my code with all those SQL statements, which is a very bad thing to do if I have to change, debug, etc.

Disclaimer: The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.