This blog has moved to: http://nerditorium.danielauger.com/

Getting RVM to Work when GVim is Launched from Ubuntu's Menu


If you want to use GVim as your Ruby editor in Ubuntu (and most likely any other Gnome-based distro), you've probably found out that your .bashrc file is not read when launching GVim from the Gnome menu. This means that your RVM paths are not available in the scope of apps launched from the menu. However, when launching apps from the menu, the launcher can access your .profile file. That being the case, here is a quick work-around for this issue:

1) Add the following code to your ~/.profile file:
# This loads RVM into the session scope of the launcher.
[[ -s "$HOME/.rvm/scripts/rvm" ]] && . "$HOME/.rvm/scripts/rvm"
2) Create a script that you can use to start up GVim, e.g. ~/Apps/GVim/gvimstart.sh. In this script, place the following:
#!/bin/bash
source ~/.profile 
gvim
3) Right-click on the applications menu and choose "Edit Menus". Then find your GVim launcher and point the command entry to your script.

I am, by no means, a Gnome expert, so please let me know if you are aware of a better solution.

Happy coding!

Learn WPF for Free


In my last WPF related post I spoke a bit about my WPF learning experience. I was fortunate enough to have an MSDN Universal subscription and any book I wanted via my employer as I went down the path. Unexpectedly, I recently received an email from a reader who wanted to learn WPF on a budget of $0.00. After thinking for a bit, I came to the conclusion that WPF can be learned without spending any money (assuming you have a computer that can run the tools). Below is a roadmap on how to do so.

Get the tools for free

Visual Studio Express, which is able to create WPF apps, can be found here:
http://www.microsoft.com/express/Windows/

If you need any other tools, such as SQL Server Express, you can most likely find them via the Web Platform Installer here: http://www.microsoft.com/web/downloads/platform.aspx

Learn the Framework

Between MSDN and the community, there is more than enough well-written documentation out there to help one become an expert on the WPF framework itself. In his book, “Advanced MVVM”, WPF expert, Josh “The Maestro” Smith, recommends the following documents to come up to speed with the WPF framework:

Introduction to WPF: http://msdn.microsoft.com/en-us/library/aa970268.aspx
WPF Architecture: http://msdn.microsoft.com/en-us/library/ms750441.aspx
A Guided Tour of WPF: http://joshsmithonwpf.wordpress.com/a-guided-tour-of-wpf/
Customize Data Display with Data Binding and WPF: http://msdn.microsoft.com/en-us/magazine/cc700358.aspx
ItemsControl: ‘I’ is for the Item Container: http://drwpf.com/blog/2008/03/25/itemscontrol-i-is-for-item-container/

A lot of the information in the above links is difficult to absorb, but I think it is very worthwhile to go through the material, and work through the examples, at least once. If some of the documents are unclear, you can always go back to them as you learn more and need more clarification.

Learn MVVM

Many people blow off MVVM as “the latest popular design pattern”, but learning MVVM is essential to working with WPF. This is because MVVM is a natural pattern to use given the way databinding works in WPF. Yes, you can write WPF apps using tried-and-true code-behind and click event handlers like people did with Winforms, but you’ll be doing yourself a disservice.

When learning MVVM, the first place to look is this very simple video by Jason “The Enigma” Dolinger, where he explains what MVVM is and what the pattern’s strengths are: http://blog.lab49.com/archives/2650

The next place I would look is the “MVVM In the box” training by Karl “The Educator” Schifflett:
http://karlshifflett.wordpress.com/2010/11/07/in-the-box-ndash-mvvm-training/

As a lesson summary, an in-depth overview of MVVM written by Josh Smith can be found here: http://msdn.microsoft.com/en-us/magazine/dd419663.aspx

Apply what you’ve learned

While there is no WPF “best practices” sample that I am aware of, I think the following can get one to see how WPF fits into an application architecture.

Building a Desktop TO-DO application with NHibernate: http://msdn.microsoft.com/en-us/magazine/ee819139.aspx
Understanding the MVVM Pattern: http://live.visitmix.com/MIX10/Sessions/EX14
Build Your Own MVVM Framework: http://live.visitmix.com/MIX10/Sessions/EX15

Not strictly MVVM, but worth looking at as application and framework samples:

PRISM: http://compositewpf.codeplex.com/
Caliburn Micro: http://caliburnmicro.codeplex.com/
Caliburn Micro Soup To Nuts Series: http://caliburnmicro.codeplex.com/documentation
MVVM Light: http://mvvmlight.codeplex.com/

Keep Learning

Read Blogs! Search out WPF-related blogs. There are enough out there to fill your news reader every day. Pete “6510” Brown puts out a Windows Client roundup frequently, which is a good starting point.

Read and participate in the knowledge dump at http://stackoverflow.com/questions/tagged/wpf.
At first you may find that some of the questions are the same things you are wondering about. After a while, you will find that you can answer some of the questions. As you answer questions, you’ll reinforce what you’ve learned.

Share

Software development is still largely a folklore based discipline. It is your duty to share what you have learned in the best way you know how.

Katamari Code


We’ve all worked on projects where the codebase is a mess. Here are a couple of common messy codebase analogies I’ve heard over the years:

The House of Cards Codebase:

The Jenga Codebase:

Although I can appreciate the above analogies, more often than not, I think Beautiful Katamari is the most appropriate analogy.

Time Until Productivity In WPF


One of the things I see over and over again when reading about teams that are deciding whether to adopt WPF is the fear of the learning curve, and the worry that they will not be productive for a long period of time. Given that I’ve recently become productive in WPF myself, I thought I would talk about my experience in this area.


A little background about myself: I would consider myself to be your average mid-thirties ALT.NET developer. I got my start programming BASIC and 65xx assembly as a kid in the early 80s on a Commodore 64, got my first programming job doing Java in the late 90s, and started using ASP.NET in the late 1.0 beta days. Programming is one of my hobbies, but I am not the type that stays up all hours of the night working on my pet OSS project. I am also not a “computer scientist” or language wonk. However, I am passionately interested in software craftsmanship. S.O.L.I.D. is prominent in my tool belt.

My attention was first turned to WPF in the early fall of 2008. My group was facing a large desktop project. The question came up as to whether we should go with Winforms or WPF. No one on the team had any practical Winforms experience, so the initial reaction was that we should go with WPF since Microsoft had made it clear that it was the future of Windows desktop development. However, we all had heard rumblings about how difficult WPF was. That being the case, each member of the team created a very small drag-and-drop application to test the waters. The general consensus was that if you ignored the more advanced features of WPF, it was just as easy as Winforms drag-and-drop + code-behind development. Mind you, we did not want to do that sort of development, but it became apparent that that style of development would be equally painful using either framework. Note that at this time, I really had no idea how to make a real application using WPF, but I did understand the very basic concept of how XAML markup changed the game from Winforms development.

Estimated time spent learning WPF during this period: 8 hours.

After the initial decision making process, I returned to ASP.NET and WCF development. The WPF project didn’t really get started until spring 2009. I was not slated to work on the project, but I was enlisted to help decide the preliminary architecture. Once I was given that task, I began to look around for application frameworks, or at the very least, some patterns that had momentum behind them. At this time, I learned about the MVVM pattern via the still insightful Jason Dolinger video. It really seemed to click in my mind, so I began looking for an application framework to support this pattern. I did not find a mature application framework per se, but I did come across PRISM. There were a few other budding frameworks at the time, but PRISM was way ahead of the game. Therefore, we decided to go with PRISM using the MVVM presentation pattern on the client side of the application. Once that decision was made, I was back off the project and doing other things. I did, however, start following a few WPF blogs at this time, but I did not do any WPF coding.

Estimated time spent learning WPF during this period: 8 hours.
Total estimated time: 16 hours.

This July, my schedule freed up and I was put on the project 50% of my time. I was tasked with a reporting interface that had to show a list of available reports and then dynamically show parameter UI elements depending on which report was selected.

As soon as I was assigned to the project, I got a copy of WPF in Action with Visual Studio 2008. I read the book half the day during my work week, and then a few hours each night at home. As someone who learned programming from type-in program listings, I made sure that I recreated all of the samples myself. Also at this time, I grabbed the latest version of PRISM and took a peek at the examples.

Estimated time spent learning WPF during this period: 30 hours.
Total estimated time: 46 hours.

The first week of actual development was pretty brutal. MVVM was not the problem. PRISM was not the problem. XAML and databinding were where I was banging my head against the wall. It kind of worked like HTML, and it kind of didn’t. That first week was pretty aggravating, but pretty much every problem I encountered was just an internet search away.

Estimated time spent learning WPF during this period: 30 hours.
Total estimated time: 76 hours.

I became productive after that first week. I’ll be the first to admit that I don’t intimately know how WPF works behind the scenes (like I do with ASP.NET). However, I do think that I became productive within a reasonable amount of time.

Looking back, I can safely say it took me about 80 hours to go from expert ASP.NET programmer to productive WPF programmer.  That being said, I do not know how long it would have taken me to go from expert ASP.NET programmer to productive Winforms developer. Something tells me it wouldn’t have been a number that would have made us use Winforms instead of WPF.

Should .NET Auto Properties Have Unit Tests?



Should .NET auto properties be unit tested? It is very easy to argue that testing auto properties falls into the “testing the .NET framework” smell and is a waste of time. However, experience has shown me otherwise. This is something that I’ve gone back and forth on many times since auto properties were introduced. For the time being, I think I’ve reached a conclusion. That conclusion is yes.

 

I typically do two types of testing: 1) Test Driven Design, and 2) Test while/after unit testing. Let’s consider these two scenarios.

When doing test driven design, there is no question. You should write tests for your auto properties. Test driven design is not about testing; it’s about design. The rule of thumb is that you don’t write any code without a test dictating its need. Case closed (in general).

Things get trickier when doing test while/after unit testing. This type of testing is more about creating a test harness. When writing tests after the fact, I know that the property is implemented using auto properties. Therefore, if I am taking a white box approach, I know that my implementation of the encapsulated property is simply the .NET framework. Following that line of thought, testing auto properties is testing the .NET framework. However, I tend to view test while/after testing as a test harness of the public contract. This is black box testing. A public property is an encapsulation, and therefore it should be tested. It’s not uncommon for an auto property to be converted into a regular property as an application lives on. I’d have to say that I would want a test there to capture any publicly facing change in that encapsulation.
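To make the black-box idea concrete, here is a minimal sketch of the kind of round-trip test I mean. The `Customer` class and the plain `Debug.Assert` call are just illustrative; in practice this would live in your NUnit/MSTest harness:

```csharp
using System.Diagnostics;

// Hypothetical entity with an auto property.
public class Customer
{
    public string Name { get; set; }
}

public static class CustomerTests
{
    // Black-box test of the public contract: set the property, read it back.
    // If Name is later converted to a regular property (validation, trimming,
    // change notification, etc.), this test captures any change in the
    // observable behavior of the encapsulation.
    public static void Name_RoundTrips()
    {
        var customer = new Customer { Name = "ACME" };
        Debug.Assert(customer.Name == "ACME");
    }
}
```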

Persistence Ignorant Lazy Loading For Your Hand-Rolled DAL In .NET 4.0 Using Lazy<T>


This post is a brief update to the .NET 3.5 article I posted about P.I. lazy loading. The only major change I have made to the code is to use the new Lazy<T> class that was introduced in .NET 4.0. This considerably cleans up the LazyLoadingList<T> class from the previous post.
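As a quick refresher on the semantics we are relying on: with Lazy&lt;T&gt;, the factory delegate does not run until Value is first read, and the result is cached for every read after that. A minimal sketch:

```csharp
using System;

class LazyDemo
{
    static void Main()
    {
        var lazy = new Lazy<int>(() =>
        {
            Console.WriteLine("factory running"); // runs once, on first access
            return 42;
        });

        Console.WriteLine("before first access");
        Console.WriteLine(lazy.Value); // prints "factory running" then 42
        Console.WriteLine(lazy.Value); // cached: just 42
    }
}
```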

Here is the new LazyLoadingList<T>:
using System;
using System.Collections;
using System.Collections.Generic;

public class LazyLoadingList<T> : IList<T>
{
    private readonly Lazy<IList<T>> _lazyList;

    public LazyLoadingList(Lazy<IList<T>> lazyList)
    {
        _lazyList = lazyList;
    }

    #region Implementation of IEnumerable

    public IEnumerator<T> GetEnumerator()
    {
        return _lazyList.Value.GetEnumerator();
    }

    IEnumerator IEnumerable.GetEnumerator()
    {
        return _lazyList.Value.GetEnumerator();
    }

    #endregion

    #region Implementation of ICollection<T>

    public void Add(T item)
    {
        _lazyList.Value.Add(item);
    }

    public void Clear()
    {
        _lazyList.Value.Clear();
    }

    public bool Contains(T item)
    {
        return _lazyList.Value.Contains(item);
    }

    public void CopyTo(T[] array, int arrayIndex)
    {
        _lazyList.Value.CopyTo(array, arrayIndex);
    }

    public bool Remove(T item)
    {
        return _lazyList.Value.Remove(item);
    }

    public int Count
    {
        get { return _lazyList.Value.Count; }
    }

    public bool IsReadOnly
    {
        get { return _lazyList.Value.IsReadOnly; }
    }

    #endregion

    #region Implementation of IList<T>

    public int IndexOf(T item)
    {
        return _lazyList.Value.IndexOf(item);
    }

    public void Insert(int index, T item)
    {
        _lazyList.Value.Insert(index, item);
    }

    public void RemoveAt(int index)
    {
        _lazyList.Value.RemoveAt(index);
    }

    public T this[int index]
    {
        get { return _lazyList.Value[index]; }
        set { _lazyList.Value[index] = value; }
    }

    #endregion
}


Here are the changes to the invoking code:
using System;
using System.Collections.Generic;
using System.Linq;

public class CompanyDAO : ICompanyDAO
{
    private readonly List<Company> _companiesInDatabase = new List<Company>
    {
        new Company() { Name = "ACME" },
        new Company() { Name = "Hardees" }
    };

    #region Implementation of ICompanyDAO

    public Company GetByName(string name)
    {
        // Write to console to demonstrate when loading is happening
        Console.WriteLine("---Loading Company---");

        // Pretend we are calling / mapping from a stored procedure.
        // FirstOrDefault (rather than First) returns null when there is
        // no match, which is what the null check below expects.
        var company = _companiesInDatabase.FirstOrDefault(x => x.Name == name);

        // Create / add the lazily loaded collection
        if (company != null)
        {
            var lazyLoader = new Lazy<IList<Employee>>(
                () =>
                {
                    var employeeDAO = new EmployeeDAO();
                    return employeeDAO.GetByCompanyName(name).ToList();
                });

            company.Employees = new LazyLoadingList<Employee>(lazyLoader);
        }

        return company;
    }

    #endregion
}


The full source can be found here: http://github.com/dauger/BlogSamples

How to fail at ORM



Let's face it: if trends continue, some form of ORM will be a fact of life at most .NET organizations that develop business / enterprise software. Microsoft isn't playing games this time with Entity Framework. They mean for it to succeed. Additionally, at the time of writing this, NHibernate has been downloaded 391,024 times from SourceForge alone (there is more than one place to download it from). This being the case, I’m going to give everyone a few pointers to ensure that their first attempt at ORM fails.

Here are my tips to ensure ORM adoption failure (in no particular order):

Consider the ORM’s SQL engine as a replacement for SQL knowledge. The whole point behind ORMs is so that I don’t have to write or understand SQL, right? WRONG!

Consider the ORM’s SQL engine to be a black box. I got back the correct dataset, so this must be the best SQL the ORM can produce, right? WRONG! Most ORMs will create drastically different SQL depending on how the object query is structured.

Don’t get more than a skin deep understanding of the ORM. If you run into a brick wall with a bit of behavior from the ORM, you can follow two paths. You can A) learn about the finer points of the ORM to resolve the issue, or B) rip the ORM out of your application. The latter is the outcome I’ve seen more often than not. A classic example of this is the N+1 select issue where the app calls the database in a loop. ORMs have things such as eager loading, multi-queries, and future queries to avoid extra trips to the database. However, it’s best to ignore the existence of those features if you want to fail at ORM.
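To make the N+1 shape concrete, here is a small self-contained sketch (the DAO-style method names are made up) that simulates database round trips with a counter, contrasting the query-in-a-loop pattern with the single batched fetch that eager loading and multi-queries give you:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class NPlusOneDemo
{
    static int _roundTrips; // counts simulated database round trips

    static List<int> GetOrderIds()
    {
        _roundTrips++; // one query for the order list
        return new List<int> { 1, 2, 3 };
    }

    static string GetCustomerForOrder(int orderId)
    {
        _roundTrips++; // one query per order: the "+N" part
        return "Customer" + orderId;
    }

    static Dictionary<int, string> GetCustomersForOrders(List<int> orderIds)
    {
        _roundTrips++; // one batched query: the eager-loading equivalent
        return orderIds.ToDictionary(id => id, id => "Customer" + id);
    }

    public static int CountNPlusOne()
    {
        _roundTrips = 0;
        foreach (var id in GetOrderIds())
        {
            GetCustomerForOrder(id); // database call inside a loop
        }
        return _roundTrips; // 1 + N = 4
    }

    public static int CountEager()
    {
        _roundTrips = 0;
        GetCustomersForOrders(GetOrderIds());
        return _roundTrips; // 1 + 1 = 2
    }

    static void Main()
    {
        Console.WriteLine("N+1 round trips: " + CountNPlusOne());
        Console.WriteLine("Eager round trips: " + CountEager());
    }
}
```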

Use ORM-generated schema without manually tweaking it. Many ORMs will happily create a schema for the developer just how they specified it, and index-free. Ideally you shouldn’t use generated schema at all once you are up and running; a DBA should be creating the schema using relational theory. However, if you want to fail, it’s best to just use that generated schema.

Use the ORM for 100% of your data access.
Most ORMs allow for dropping into prepared SQL, stored procedures, and even db function calls. However, it’s best to ignore this functionality if you want to fail.

Maintain OO purity at all costs.
Does fetching your aggregate root cause an 11-table join? So be it.

Cut the DBA out of the development process.
The whole point of ORM is to cut out the DBA, right? WRONG! The DBA should be just as active in helping to craft the data access strategy as they would be with a hand-rolled data access layer. Cutting the DBA out of the picture is a recipe for failure.

Don’t profile your application.
If you want to fail, it’s best to find out whether you have created a SQL nightmare only once you hit production. NHProf, EFProf, L2SProf, and SQL Profiler are your friends. Ignore them to fail.

I hope you find these tips helpful. I’d like to hear about any other tips for ORM failure you might have.

Possibly The Most Important C# Interview Question




The Problem

Recently I was reviewing some code at work that was written by a senior developer who had left the organization. I saw something along these lines that set off a huge red flag in my head:

Address newAddress = customer.Address;
newAddress.LineOne = "122 Songbird Lane"; // more changes etc...
customer.Address = newAddress; // RED FLAG!

Although the above code technically works, I took it as a warning sign indicating that the developer probably didn’t understand how references work. Sadly, my suspicions were confirmed after I dug through some more code. Even worse, the application was littered with hacks to fix areas where this misunderstanding manifested problems.

Yes ladies and gentlemen, there are people out there that have been doing C# development since .NET 1.0 that don’t have a functional mental model of how the language works. I wish I could say this was the first time I’ve run into this. Sadly, I’ve run into it several times over the years.

The Solution

It is very easy to weed these people out during the interview process by asking a very simple interview question. That question is:

What does the following program output to the command line?


using System;

public class Person
{
    public string Name { get; set; }
}

class Program
{
    static void Main(string[] args)
    {
        Person joe = new Person();
        joe.Name = "Joe";

        Person tim = joe;
        tim.Name = "tim";

        Console.WriteLine(joe.Name);
        Console.WriteLine(tim.Name);

        int myNumber = 666;

        DoSomething(myNumber);

        Console.WriteLine(myNumber);
    }

    private static void DoSomething(int someNumber)
    {
        someNumber = 777;
    }
}

The trick is that you have to ask it of every developer, no matter how many years they have under their belt. (For the record, the program prints “tim” twice and then 666: joe and tim are two references to the same Person object, while myNumber is passed by value, so DoSomething only changes its local copy.)

Speeding Up Cassini In Vista And Windows 7

I was doing some ASP.NET MVC work this evening on my new supa-fast Windows 7 machine using the built-in Visual Studio Cassini webserver. For some odd reason, it was taking a few seconds for my tiny pages to resolve / render. I was using Chrome, so I decided to try Firefox. Firefox was just as slow. I then tried using IE and it performed with the speediness I was used to on my old XP install. I did some digging, and I came across the solution to the problem in two places:

http://www.wagnerdanda.me/2009/12/asp-net-development-server-slow-on-windows-vista7-with-firefox-or-chrome/
http://stackoverflow.com/questions/1416128/my-local-host-goes-so-slow-now-that-i-am-on-windows-7-and-asp-net-mvc

It turns out this is an issue with IPv6 and resolving localhost on Vista and Windows 7. The fix is very easy – you simply need to uncomment the localhost entry in your C:\Windows\System32\drivers\etc\hosts file by changing this:

# localhost name resolution is handled within DNS itself.
#    127.0.0.1       localhost

to:

# localhost name resolution is handled within DNS itself.
     127.0.0.1       localhost

You may need to run your text editor as administrator in order to save the changes.

Practical Persistence Ignorant Lazy Loading For Your Hand-rolled DAL


Introduction – A Word Of Warning

First off – I do not recommend you write your own hand-rolled data access solution for an OO .NET application. Ayende has a great post that should convince you to use something like NHibernate, LLBLGen, Entity Framework, or Linq2Sql. I strongly agree with him. I’ve worked on several projects which had a hand-rolled, stored-procedure-based DAL. All of the DALs that were of a medium-to-large size eventually turned into a huge mess, or something with tons of friction and poor performance. That being said, I am not allowed to use an ORM at my current employer, and I know that many others are not allowed to either, so I think this post may be of use for people in the same situation.

Secondly – I specifically used the word “practical” to describe this strategy. Although there are ways to do this sort of thing with code gen and/or reflection, I am going to assume that most places that don’t use ORMs will find those techniques to be too complex.

Thirdly – This article is focused on the lazy loading aspect of the sample code. The implementation / organization of the data access layer is not to be taken as best practices. It is organized in a way to be easily understood, and so it does not get in the way of the topic at hand. I am also ignoring many other features of a robust data access layer (change tracking, transactions, etc…) as I think they do not need to be understood in order to put this concept in action. However, there is nothing about this implementation that would exclude it from being used with other ORM concepts.

Terminology

As far as this article is concerned, here are the operational definitions of the core terms:

Entity: An object that is persisted to the database that has business related behaviors.

Data Access Object (DAO): A DAO is a data access layer (DAL) object that has the responsibility of calling the database and mapping the results to entities. The Repository Pattern is a relative of the DAO pattern.

Persistence Awareness (PA): An entity that is persistence aware is one that is responsible for its own persistence.

Examples:
var myClass = MyClass.LoadByID(id);
myClass.Save();
A PA entity is either directly or indirectly aware of its data store and is responsible for coordinating the persistence of its children. When the persistence methods are called, it news up the appropriate DAO and gets the results back.

Persistence Ignorance (PI): An entity that is persistence ignorant relies on other classes to handle the responsibility of persisting it.

Example:
var myClassDAO = new MyClassDAO();
var myClass = myClassDAO.GetByID(id);
myClassDAO.Save(myClass);
The coordination of persisting and retrieving child entities is also handled by the external class.

Lazy Loading: Lazy loading is a term that refers to delaying the retrieval of a child collection until it is accessed. This is done to save trips to the database and to reduce application memory consumption. Consider the following code:
var companyDAO = new CompanyDAO();
var company = companyDAO.GetByName("Hardees");

Console.WriteLine(company.Name);

foreach (var employee in company.Employees)
{
    Console.WriteLine(employee.Name);
}
If we are using lazy loading, the employee collection would not be loaded until we hit the body of the foreach loop. If we were using eager loading, the employee collection would have been loaded when the company was loaded.

Why should PI be preferred to PA?

There are two reasons, which are a bit related. First, persistence code tends to be tricky and long-winded; oftentimes there is more persistence code in a PA class than business logic. Second, you always want to give your class as few responsibilities / reasons to change as possible. The code in your entities should focus on business behavior rather than infrastructure concerns such as persistence.

If PI is preferred, why do most hand rolled data access layers use PA?

In my experience, people usually prefer PA because it makes lazy loading extremely easy. The typical pattern is to hit the database when a property is accessed and its backing field is null. This is a very easy pattern to understand, but unfortunately it sticks our entities with all of the overhead and complexity of persistence concerns.
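For reference, the PA pattern described above looks roughly like this (Employee and EmployeeDAO are hypothetical stand-ins):

```csharp
using System.Collections.Generic;

public class Employee { public string Name { get; set; } }

// Hypothetical DAO that the entity news up itself.
public class EmployeeDAO
{
    public IList<Employee> GetByCompanyName(string name)
    {
        // Pretend this hits the database.
        return new List<Employee>();
    }
}

// A persistence-aware entity: the classic "load when the backing field
// is null" pattern. Easy to understand, but persistence concerns now
// live inside a business class.
public class Company
{
    private IList<Employee> _employees;

    public string Name { get; set; }

    public IList<Employee> Employees
    {
        get
        {
            if (_employees == null)
            {
                _employees = new EmployeeDAO().GetByCompanyName(Name);
            }
            return _employees;
        }
    }
}
```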

Less Talk, More Rock!

Now that we have all the background out of the way, let’s take a look at an implementation I’ve tried recently that seems to work well and is very easy to understand. It comes down to two classes which are coordinated by the application's DAO classes. These two classes are:
  • LazyLoadingList<T> - This is a wrapper around List<T> which triggers a load whenever one of its methods is called.
  • LoadDelegate<T> - This is the delegate that is executed when a LazyLoadingList<T>’s load method is triggered.
Here is the code for the LazyLoadingList<T>:
using System.Collections;
using System.Collections.Generic;
using System.Linq;

public class LazyLoadingList<T> : IList<T>
{
    private List<T> _list;
    private readonly LoadDelegate<T> _loadDelegate;

    public LazyLoadingList(LoadDelegate<T> loadDelegate)
    {
        _loadDelegate = loadDelegate;
    }

    public void Load()
    {
        _list = _loadDelegate().ToList();
        Loaded = true;
    }

    private bool Loaded { get; set; }

    private void LoadIfNotLoaded()
    {
        if (!Loaded)
        {
            Load();
        }
    }

    #region Implementation of IEnumerable

    public IEnumerator<T> GetEnumerator()
    {
        LoadIfNotLoaded();
        return _list.GetEnumerator();
    }

    IEnumerator IEnumerable.GetEnumerator()
    {
        LoadIfNotLoaded();
        return _list.GetEnumerator();
    }

    #endregion

    #region Implementation of ICollection<T>

    public void Add(T item)
    {
        LoadIfNotLoaded();
        _list.Add(item);
    }

    public void Clear()
    {
        LoadIfNotLoaded();
        _list.Clear();
    }

    public bool Contains(T item)
    {
        LoadIfNotLoaded();
        return _list.Contains(item);
    }

    public void CopyTo(T[] array, int arrayIndex)
    {
        LoadIfNotLoaded();
        _list.CopyTo(array, arrayIndex);
    }

    public bool Remove(T item)
    {
        LoadIfNotLoaded();
        return _list.Remove(item);
    }

    public int Count
    {
        get
        {
            LoadIfNotLoaded();
            return _list.Count;
        }
    }

    public bool IsReadOnly
    {
        get
        {
            LoadIfNotLoaded();
            return ((ICollection<T>)_list).IsReadOnly;
        }
    }

    #endregion

    #region Implementation of IList<T>

    public int IndexOf(T item)
    {
        LoadIfNotLoaded();
        return _list.IndexOf(item);
    }

    public void Insert(int index, T item)
    {
        LoadIfNotLoaded();
        _list.Insert(index, item);
    }

    public void RemoveAt(int index)
    {
        LoadIfNotLoaded();
        _list.RemoveAt(index);
    }

    public T this[int index]
    {
        get
        {
            LoadIfNotLoaded();
            return _list[index];
        }
        set
        {
            LoadIfNotLoaded();
            _list[index] = value;
        }
    }

    #endregion
}

Here is the code for the LoadDelegate<T>:

public delegate IEnumerable<T> LoadDelegate<T>();

The Demo

I’ve put a demo up on GitHub that ties this all together. The Employee load delegate gets wired up in the Company DAO like so:


using System;
using System.Collections.Generic;
using System.Linq;

public class FakeCompanyDAO : ICompanyDAO
{
    private readonly List<Company> _companiesInDatabase = new List<Company>
    {
        new Company() { Name = "ACME" },
        new Company() { Name = "Hardees" }
    };

    #region Implementation of ICompanyDAO

    public Company GetByName(string name)
    {
        // Pretend we are calling / mapping from a stored procedure.
        // FirstOrDefault returns null when there is no match, which is
        // what the null check below expects.
        var company = _companiesInDatabase.FirstOrDefault(x => x.Name == name);

        if (company != null)
        {
            company.Employees = new LazyLoadingList<Employee>(
                () =>
                {
                    var employeeDAL = new FakeEmployeeDAO();

                    // To demonstrate when loading is happening
                    Console.WriteLine("Loading Employees");

                    return employeeDAL.GetByCompanyName(name);
                });
        }

        return company;
    }

    #endregion
}

The Demo program code:

class Program
{
    static void Main(string[] args)
    {
        ICompanyDAO db = new FakeCompanyDAO();

        var company = db.GetByName("Hardees");

        Console.WriteLine("Company Loaded: " + company.Name);
        Console.WriteLine("About to iterate Employees");

        foreach (var emp in company.Employees)
        {
            Console.WriteLine(emp.Name);
        }
    }
}

The output shows “Company Loaded: Hardees” and “About to iterate Employees” first; “Loading Employees” does not appear until the foreach loop begins iterating.


Possible Improvements

There are many possible improvements to this strategy depending on how far one is willing to go. For example, it would be very easy to auto-wire everything up with one generic delegate if DAO semantics were uniform across DAOs.

Source code: http://github.com/dauger/BlogSamples/tree/master/LazyLoadingCollections/


UPDATE: This technique has been reworked for .NET 4.0 here.

Gnome Evolution’s Missing Feature

Here is a screenshot of the Windows Live Email account setup:
LiveMail

Here is a screenshot of Thunderbird’s account setup:
thunderbird

Here is a screenshot of Gnome Evolution’s account setup:
evolution

Notice anything? Evolution does not give you the option to leave email on the server until you delete it. This will lead to an inbox management nightmare for those of us who multi-boot. Evolution is very nicely integrated into Ubuntu, but sadly I cannot use it because of this missing feature.