This blog has moved to: http://nerditorium.danielauger.com/

Your ASP.NET.* Project is Not Just Your UI Layer



In this post by Jimmy Bogard about “dealing with non-transactional operations that must happen if some transaction succeeds,” the often embraced, but sometimes criticized, Session Per Request pattern (aka Transaction Per Request) came under fire in the comments.
I’ve had this conversation with other developers who have raised similar concerns, and the argument against going with this pattern is based on the notion that the “Web” project must not orchestrate, or indirectly know about, the other layers of the application because it is the “UI” layer.
The fact is that the web project houses two conceptual layers:
  1. It houses the UI layer for the application.
  2. It houses the entry point / bootstrapping for the infrastructure of the application. It is the application.
I am perfectly fine with the plumbing of the web project directly managing transactions. Some part of the application must manage this, and I believe it should be pushed as far out to the seams of the application platform as possible.
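To make the point concrete, here is a minimal sketch of what that plumbing can look like in Global.asax (assuming NHibernate with the “web” current session context and a static SessionFactory property; this is illustrative, not the code from Jimmy’s post):

protected void Application_BeginRequest(object sender, EventArgs e)
{
    // Open a session and start a transaction at the outermost seam.
    var session = SessionFactory.OpenSession();
    session.BeginTransaction();
    NHibernate.Context.CurrentSessionContext.Bind(session);
}

protected void Application_EndRequest(object sender, EventArgs e)
{
    var session = NHibernate.Context.CurrentSessionContext.Unbind(SessionFactory);
    if (session == null) return;

    try
    {
        // Commit on success; roll back if the commit itself fails.
        session.Transaction.Commit();
    }
    catch
    {
        session.Transaction.Rollback();
        throw;
    }
    finally
    {
        session.Dispose();
    }
}

Nothing in the UI layer proper ever sees this code; it lives in the application's entry-point plumbing, which is exactly the point.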

All Paths to a Sitefinity Image are not Equal



Recently, while doing a performance pass through a Sitefinity 4 application, I noticed that a public-facing page had an unusually slow load time. Of course, the first thing I did was open up Firebug to see where the time was being spent. To my surprise, the majority of the time was spent waiting for the initial response from the server. After a little digging, I narrowed down the time-sink to a custom widget that pulled images from a Sitefinity image album. In particular, the query to figure out which images to display was the source of the slowness.

A few things to note:

  • There were about 250 image albums in the CMS (though all but 5 or so of them were empty at the time of discovery).
  • There were roughly 30 images in the album we were querying against.
  • When the original query was written, there were only a few images in the system, which is why the issue wasn’t apparent at the time.

Here’s an approximation of the original call to retrieve the images through the Sitefinity API:
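(A reconstructed sketch; the names approximate the Sitefinity 4 API from memory, and the Status filter is an assumption.)

var librariesManager = LibrariesManager.GetManager();

// Query through each image's parent album to find the images in "Foo".
var images = librariesManager.GetImages()
    .Where(i => i.Parent.Title == "Foo"
             && i.Status == ContentLifecycleStatus.Live)
    .ToList();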

In this bit of code, we are querying for all images that belong to the image album named “Foo”. Note that the query is structured so that we go through the image object to determine which album it belongs to.

The above query took about 2 seconds to return roughly 30 Sitefinity Image objects. This would not do. Therefore, I took to tweaking the query. I tried several different things while maintaining the original approach, but since Sitefinity’s LINQ provider isn’t a complete implementation, I couldn’t make much headway.

I eventually decided to rethink the approach and query for the images by going in through the album instead of going through the image object.
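Here is a sketch of the reworked query (again, names approximate the Sitefinity 4 API):

var librariesManager = LibrariesManager.GetManager();

// Resolve the album once, then filter images by the album's Id so the
// query doesn't have to walk through every image's parent.
var album = librariesManager.GetAlbums()
    .Where(a => a.Title == "Foo")
    .FirstOrDefault();

var images = librariesManager.GetImages()
    .Where(i => i.Parent.Id == album.Id
             && i.Status == ContentLifecycleStatus.Live)
    .ToList();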

This reduced the query time to about 0.2 seconds.

Matt Chat–The YouTube Channel You Should be Watching if You Played Video Games in the 80s or 90s




Nerd service announcement:

If you played video games in the 80s and/or 90s and have never heard of, or watched, Matt Chat, you are in for a treat. One of my friends aptly described it as “Behind the Music for vintage video games.”

Matt Barton (college professor and author) debuted Matt Chat in February 2009 with a low-production-quality, but loving, retrospective of SSI’s classic AD&D CRPG, “Pool of Radiance.” Since then, Matt has produced an additional 100 episodes and has improved the production values by leaps and bounds. In addition to his editorial retrospectives, Matt began doing interviews with game developers around episode 40.

To whet your appetite, here is a listing of the first 101 episodes of Matt Chat:

1: Pool of Radiance
2: Myst
3: Defender of the Crown
4: M.U.L.E.
5: Elite
7: The Sims
8: The Secret of Monkey Island
9: The Oregon Trail
10: Lemmings
11: Civilization
12: Metroid
13: Adventure
14: The Lost Vikings
15: The PLATO Computer System
16: Lode Runner
17: Ultima VII, The Black Gate
18: Summer Games
19: Gauntlet
20: Worms and Artillery Games
21: Super Mario Kart
22: Deja Vu, Uninvited, Shadowgate, and MacVentures
23: Planescape Torment
24: Star Control II and the Spacewar Legacy
25: Knights of the Old Republic
26: David Crane's Ghostbusters
27: Autoduel
28: Maniac Mansion
29: Wizardry
30: Fallout
31: A Rockstar Ate My Hamster
32: Tomb Raider
33: Jade Empire
34: System Shock 2
35: Alone in the Dark
36: Starcraft
37: Syndicate
38: Legacy of the Ancients
39: World of Warcraft Part One
39: World of Warcraft Part Two
40: Sword of Fargoal with Jeff McCord
41: The History of Cinemaware with Bob Jacob
42: Dragon Age Origins
43: Archon
44: Ralph Baer, the Father of Videogames
45: Rogue
46: Choose Your Own Adventure with R.A. Montgomery
47: Quest for Glory
48: Dungeons of Daggorath
49: Nancy Drew featuring Jessica Chiang
50 Part 1: Leisure Suit Larry featuring Al Lowe
50 Part 2: Leisure Suit Larry featuring Al Lowe
51: Interview with John Romero (Early Days)
52: Wolfenstein 3D with John Romero
53: Doom with John Romero
54: Quake with John Romero
55: Daikatana with John Romero
56: Ocarina of Time
57: Tunnels of Doom
58: Heroes of Might and Magic
59: The Settlers
60: X-COM, UFO Defense
61: Sid Meier's Pirates
62: Chris Avellone's Early Days
63: Planescape Torment with Chris Avellone
64: Sean Cooper's Early Days
65: Syndicate with Sean Cooper
66: Fallout with Tim Cain, Pt. 1
67: Fallout with Tim Cain Pt. 2
68: Arcanum and More with Tim Cain
69: Howard Scott Warshaw's Early Days
70: ET and Yar's Revenge with Howard Scott Warshaw
71: The Bard's Tale
72: Deus Ex
73: The Dig
74: Dune II
75: Interview with Megan Gaiser and Rob Riedl of Her Interactive
76: King's Quest
77: Darklands
78: Arnold Hendrick Interview Pt. 1
78: Interview with Arnold Hendrick Pt. 2
78: Interview with Arnold Hendrick Pt. 3
79: Scott Adams' Early Days
80: Adventureland with Scott Adams
81: Questprobe and More with Scott Adams
82: Interview with Rebecca "Burger" Heineman Pt. 1
83: Rebecca Heineman Pt. 2
84: Rebecca Heineman Pt. 3
85: Rebecca Heineman Pt. 4
86: Bard's Tale IV and Wasteland II with Rebecca Heineman
87: Twilight Scene it with Don Kurtz (censored)
88: The Donimator Gets His
89: Bard's Tale and Wizardry with Brian Fargo
90: Wasteland and Fallout with Brian Fargo
91: The Fall of Interplay with Brian Fargo
92: Mail Order Monsters
93: Scratches and Asylum with Agustín Cordes
94: Interview with Agustín Cordes Pt. 2
95: Skylanders and more with Paul Reiche and Fred Ford
96: Star Control and More with Paul Reiche and Fred Ford
97: The Horde and More with Fred Ford and Paul Reiche
98: Scott Miller Interview Pt. 1
99: Duke Nukem with Scott Miller
100: Scott Miller will Live Forever
101: Baldur's Gate

My Sister Always Said I’d Turn Out to be a Nerd

Today was a good day. As mentioned previously, I’ve been looking for a new career opportunity for the past month-and-a-half or so. My journey came to a head yesterday when I received two fantastic job offers. I’m happy to report that I have accepted an offer to join the Nerdery as a Software Engineer. I’m looking forward to working with this group of people who are way smarter than I am, and to the chance to grow with this unique company.

Additionally, I’ve been reflecting on the overall job search experience. I am very aware of how lucky I am to have rubbed shoulders with many insanely smart and experienced professionals. I really appreciate the time everyone took to meet with me. Life is good.

Full Stack Interview Coding Challenges

It’s been seven years since I was last looking for a job, so I’m not sure if this is a new trend or not: over half the companies I’ve started the interview process with have given me time-unlimited, full-stack coding challenges as part of the interview process. I just completed my third one in two weeks, and I thought I’d share my thoughts about this practice in general.

Challenge Requirements
Here are highly condensed versions of the requirements for each challenge:

Company A:

  1. Create an ASP.NET MVC application that consists of one page that allows users to add, remove, and rate movies. Movie ratings are to be indicated with a star rating control similar to what Netflix uses.
  2. Create a Javascript widget that a user can host on their blog which pulls and displays their movie ratings from the application.

Company B:

Create an ASP.NET MVC website that allows users to upload and view images. If the image is more than 500 pixels wide, or 700 pixels tall, downsize the image to the maximum allowed size, but maintain the aspect ratio. All images should be stored on the server, including the original image when images are resized.

Company C:

Create an ASP.NET application (Webforms or MVC) that allows users to add and vote on Xbox game titles for the company break room. Users can only vote once a day, and they cannot vote on the weekends. Any user can mark a game as owned, in which case it ends up on a list of owned games. The titles and votes will be stored / retrieved via a set of WCF services that are provided for you. In addition to sending the code in for review, submit a URL to a running copy of the application.

What I Liked About the Process

First off, I really liked that these companies wanted to see code. I think far too many places hire developers without seeing a line of code. True, you can determine what somebody is capable of through conversation (especially at senior / expert levels), but I think it’s a pretty big risk. Some people are really good at talking and/or look good on paper when in truth they don’t know what they are doing.

Secondly, I really enjoyed doing these challenges. They’ve kept me from getting rusty during this period of unemployment. Additionally, I had to learn at least one thing for each of the applications. For company A’s challenge, I had to learn how to create a widget that used JSONP. For company B’s challenge, I had to learn to resize an image while keeping its aspect ratio using only the core .NET Framework (see the sketch below). For company C’s challenge, I learned how to use the MVCContrib Grid and the jquery.dataTable plugin. I also learned and used AppHarbor to host the application.
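For the curious, here is roughly what the aspect-ratio-preserving resize boils down to with nothing but System.Drawing. The 500x700 limits mirror company B’s requirements; the helper itself is my own illustration, not their expected answer:

using System;
using System.Drawing;
using System.Drawing.Drawing2D;

public static class ImageResizer
{
    private const int MaxWidth = 500;
    private const int MaxHeight = 700;

    public static Image Resize(Image source)
    {
        // Pick the single scale factor that satisfies both bounds,
        // and never upscale.
        double scale = Math.Min(1.0, Math.Min(
            (double)MaxWidth / source.Width,
            (double)MaxHeight / source.Height));

        if (scale >= 1.0)
            return source;

        var resized = new Bitmap(
            (int)Math.Round(source.Width * scale),
            (int)Math.Round(source.Height * scale));

        using (var g = Graphics.FromImage(resized))
        {
            g.InterpolationMode = InterpolationMode.HighQualityBicubic;
            g.DrawImage(source, 0, 0, resized.Width, resized.Height);
        }

        return resized;
    }
}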

What I Didn’t Like or Was Uncomfortable With During the Process

Finding time to do all three of these within a couple of weeks was a little rough. I did find the time, but part of me worried that the companies were wondering why I hadn’t turned their challenge in yet. In reality, they didn’t know that I was working on three of them simultaneously. This is something I probably should have communicated to them.

The thing I was most uncomfortable with was that I felt like I had more than enough rope to hang myself with in terms of doing things that weren’t wrong per se, but that the reviewer might frown upon due to personal taste. One thing I was really torn on was how C# 2/3/4 idiomatic I should make my code. Taking advantage of generics, lambdas, LINQ, anonymous types, etc. could cause the code to look like gibberish to a reviewer who is still writing code in C# 1.0 style. On the flip side, I felt I would look like I was stuck in the past if I didn’t write modern C#. I’m not sure that finding the middle ground is the right thing to do in this case either, as it could look inconsistent.

Along the same lines, I found it very difficult to decide on the right level of architectural complexity to use. These apps were all a bit more than trivial, so I could have gone either way with the complexity. As with idiomatic C#, I found myself wondering if the reviewer was going to think I was oversimplifying or overcomplicating things. Additionally, the reviewer may actually have wanted me to overcomplicate things a bit to show what architectural patterns I know.

These two issues are things that usually get sorted out when you hash over ideas with people in person, but I found it difficult to know who my audience was when given nothing but a set of requirements.

Final Thoughts

All-in-all, I thought it was a positive and enjoyable experience. If I’m ever in a position to help with hiring again, I would push hard for code samples to be part of the process. However, I probably wouldn’t give such open-ended challenges. I’d give multiple smaller, more focused problems.

.NET Dependency Management in a Pre-NuGet World


This post is an attempt to capture how my previous team dealt with dependency / package management. The team, at its largest point, consisted of about 15 developers. There were roughly 200 3rd party dlls, and roughly 150 internal dlls in the dependency mix. No single app needed all 350 dlls, but groups of these dlls were common to applications in the same domain space of the enterprise. 3rd party dlls were things such as the MS Enterprise Library, image conversion libraries, desktop scanner interop libraries, etc… Internal dlls were things such as common utility libraries, WCF service contracts, DTOs etc…

Until the recent development of projects such as NuGet and OpenWrap, dependency management in .NET was a big problem. The larger a development group’s topology, the bigger the pain point. Because of this, a lot of teams don’t even realize they are going to have issues until the pain is upon them. Additionally, I think the complexity of describing the problem has helped keep package management an elephant in the .NET room for a long time.

Since there has been abundant discussion on this issue lately, I’m going to skip describing the problems with dependency management and get straight into what we tried over the years, and what the final solution ended up being. It’s important to note that my team used TFS (2008) for source control, since some of the steps we took were in response to TFS’s weaknesses.

First attempt ending in failure:

  • Manage all 3rd party dlls by putting them in a [sln]/bin folder (tracked by TFS).
  • Manage all internal dependencies as shared projects across multiple slns.
I’m sure some of you are already cringing after reading that last bullet point, but you’ll have to admit it is the first solution that comes to mind for a lot of developers. Additionally, it is a very simple solution. The biggest issue with this solution is sharing a project across several slns. Unfortunately, any bit of code in those shared projects is easily changeable from multiple slns; therefore, it is very easy to break several slns in one fell swoop. Secondly, configuration management goes out the window: every time a sln is compiled, each shared project gets a new version that bypasses any sort of change management.

Second attempt ending in failure:

  • Manage all 3rd party and internal dlls by putting them in a [sln]/bin folder.
This solution complicates things slightly (in a positive way) by requiring each sln to use a proper release of all internal dlls; therefore, you are able to have some degree of configuration management. However, this solution failed for us because of one simple fact: TFS does not track binaries well. We never ran into this weakness with the first solution because all of the 3rd party dlls were set-and-forget; they never changed after setting the initial reference. The internal dlls, however, would change daily, or even hourly. At first this seemed like the perfect solution, but all sorts of strange errors started occurring. One person would get latest and everything would work, but another person would get latest and have errors. TFS simply could not tell if you needed a new version of the dlls in the /bin folder. Sadly, even a “clean all” and “get specific version” didn’t fix things a lot of the time.

Third attempt ending in success:

  • Build a custom package management system.
Our custom package management system worked thusly: we created a network share. For each logical release (e.g., MS EntLib, Internal.Common.Util) we created both a [networkShare]\[package]\LatestRelease folder and a [networkShare]\[package]\Release_[version number] folder. We then created a console EXE with a bunch of options that would: 1) go and grab all of the latest dlls for all packages and dump them into one pre-defined “latest” folder on the developer PC, and 2) recreate the “releases” tree structure on the developer PC. This way, developers have the ability to drink from the fire-hose (the latest folder) or go for set releases (from the releases folder tree). TFS is not tracking dlls at all; rather, we are relying on an absolute-path file reference. Additionally, calling this executable becomes a build task on the CI server. A sketch of the “grab latest” option is below.
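For illustration, the core of the “grab latest” option was little more than a directory walk and copy. A minimal sketch (the share and local paths here are hypothetical; ours were configurable via command-line options):

using System;
using System.IO;

internal static class PackagePuller
{
    // Hypothetical locations, for illustration only.
    private const string ShareRoot = @"\\buildshare\packages";
    private const string LocalLatest = @"C:\dependencies\latest";

    private static void Main()
    {
        Directory.CreateDirectory(LocalLatest);

        // One folder per logical package on the share.
        foreach (var packageDir in Directory.GetDirectories(ShareRoot))
        {
            var latest = Path.Combine(packageDir, "LatestRelease");
            if (!Directory.Exists(latest))
                continue;

            foreach (var dll in Directory.GetFiles(latest, "*.dll"))
            {
                // Overwrite so re-running always refreshes the local copy.
                File.Copy(dll, Path.Combine(LocalLatest, Path.GetFileName(dll)), true);
            }
        }
    }
}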

This solution isn't perfect. Namely, it relies on developers remembering to run the executable from time to time. Additionally, there are potentially some versioning scenarios that could occur between packages that expect different versions of sub-dependencies. However, that issue never manifested itself in our environment.

NuGet is not the final answer for teams using TFS
I've been following the NuGet dev list closely, and NuGet is considered to be a development-time dependency resolver only, not a build-time resolver. This means that if you use TFS to track your NuGet package folders, you could still run into dll versioning issues.

Update (4/17/2011)
NuGet team member David Ebbo has blogged that functionality has been added to allow use of NuGet without committing the packages folder to source control: http://blog.davidebbo.com/2011/03/using-nuget-without-committing-packages.html

Update (4/30/2011)
According to Phil Haack, NuGet is going to get official support for non-committed packages: http://haacked.com/archive/2011/04/27/feedback-request-for-using-nuget-without-committing-packages.aspx



I know that many of us have had to face this problem, so I'd really enjoy hearing about how you addressed it.


Moving On

Yesterday was a tough day. I resigned from my position of seven years as a Developer / Analyst for the Office of the Minnesota Secretary of State.

Over the past couple of months, the development group has gone through several reorganizations. Yesterday another reorganization happened, including a substantial software platform shift. I did not feel this shift was good for my career, nor was it something I wanted to participate in. Additionally, I think the days of in-house software development for government agencies are coming to an end at light speed. Therefore, with a heavy heart, I felt it was time for me to go. Up until recently, it was the most enjoyable job of my career. I wish my former coworkers the best of luck, and I look forward to hearing their war stories as time goes on.

The Case of Web Deploy 2.0 and the Missing MSDeploy.exe


This week, I decided to install the newly released Web Deploy 2.0 on my machine at work. I already had Web Deploy 1.0 on my machine, so I decided to uninstall that first before installing 2.0. After the installation, I began getting the following error when trying to execute VS2010-created web deployment packages from the command line:

Error:  The system was unable to find the specified registry key or value
msdeploy.exe is not found on this machine. Please install Web Deploy before exe
cute the script.
Please visit http://go.microsoft.com/?linkid=9278654
=========================================================
=========================================================

Of course, the first thing I did was validate that MSDeploy was indeed installed with Web Deploy 2.0. It was. I then tried adding MSDeploy's location to the PATH environment variable. No dice. I then tried reinstalling Web Deploy 2.0. Still no luck. After trying some other things, I cracked open the deployment package's cmd file and found this:
@rem ---------------------------------------------------------------------------------
@rem if user does not set MsDeployPath environment variable, we will try to retrieve it from registry.
@rem ---------------------------------------------------------------------------------
if "%MSDeployPath%" == "" (
    for /F "usebackq tokens=2*" %%i in (`reg query "HKLM\SOFTWARE\Microsoft\IISExtensions\MSDeploy\1" /v InstallPath`) do (
        if "%%~dpj" == "%%j" (
            set MSDeployPath=%%j
        )
    )
)

For whatever reason, Web Deploy 2.0 did not add either of these items (the registry key or the environment variable) during install. I ended up adding the MSDeployPath environment variable myself, and all was good.
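In case it helps anyone else, something along these lines from a command prompt does the trick. The path assumes Web Deploy 2.0's default install location; point it at wherever msdeploy.exe actually lives on your machine:

setx MSDeployPath "C:\Program Files\IIS\Microsoft Web Deploy V2"

Newly opened command prompts will then pick up the variable.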

Programming is a Craft


Here are my initial thoughts after reading: Programming is not a Craft

“craft -noun 1. an art, trade, or occupation requiring special skill, esp. manual skill: the craft of a mason." (dictionary.com)

“In English, to describe something as a craft is to describe it as lying somewhere between an art (which relies on talent) and a science (which relies on knowledge). In this sense, the English word craft is roughly equivalent to the ancient Greek term techne." (wikipedia.com)

In my opinion, on the whole, software development is a combination of craft and primitive engineering (which is slowly emerging as a true engineering discipline). The term craft usually refers to how something was made, not what was made or the perceived value of the end result to the layman. One of the commenters on the original post mentioned carpentry as a craft that may not always be defined by the value the layman places on the end product. In this vein, I will also mention that witchcraft is considered a craft.

One of the hallmarks of craft is that it is filled with heuristics and folklore. Much of the work of today's software developer is squarely within this realm. More often than not, there is no clear-cut way to solve a problem. We craft a solution based on our experiences and the folklore we encounter. It is worth noting that I consider blogs, Google searches, MSDN, visits to a coworker's desk, IRC, and stackoverflow.com to be sources of folklore.

I must say that I do agree with much of the post, but I think the terminology presented is incorrect. It would have been wiser to say that software development is most often not an art, and that ego has no place in it.

I also have to concede, in hindsight, that I believe the software craftsmanship manifesto was a bad idea. I recently learned there was a period of time when it was being debated whether there should be a clause in it forbidding the use of anything (tools, languages, etc...) that was not OSS. In my opinion, the manifesto only serves to force a definition on something that needs no additional definition.

The Mythology of Commodore Told in about 15 Minutes by Jim Butterfield

I was cleaning up the data drive on my computer tonight and I came across this gem, which is also hosted on blip.tv:

[Embedded video: Jim Butterfield tells the story of Commodore in about 15 minutes]
Many of us owe our careers to Butterfield, and to the father of the 65XX, Chuck Peddle.
If you are interested in the history of Commodore, please check out Brian Bagnall's Commodore: A Company on the Edge.