This blog has moved to: http://nerditorium.danielauger.com/
I will eventually move posts from this blog over to the new one, and hopefully figure out some sort of way to do redirects for the more popular posts.
In this post by Jimmy Bogard about “dealing with non-transactional operations that must happen if some transaction succeeds,” the often embraced, but sometimes criticized, Session Per Request pattern (aka Transaction Per Request) came under fire in the comments.
I’ve had this conversation with other developers who have raised similar concerns, and the argument against this pattern is based on the notion that the “Web” project must not orchestrate or indirectly know about the other layers of the application because it is the “UI” layer.
The fact is that the web project houses two conceptual layers:
- It houses the UI layer for the application.
- It houses the entry point / bootstrapping for the infrastructure of the application. It is the application.
Recently, while doing a performance pass through a Sitefinity 4 application, I noticed that a public facing page had an unusually slow load time. Of course, the first thing I did was open up Firebug to see where the time was being spent. To my surprise, the majority of the time was spent waiting for the initial response from the server. After a little digging, I narrowed down the time-sink to a custom widget that pulled images from a Sitefinity image album. In particular, the query to figure out which images to display was the source of the slowness.
A few things to note:
- There were about 250 image albums in the CMS (all but 5 or so of them were empty at the time of discovery, however).
- There were roughly 30 images in the album we were querying against.
- When the original query was written, there were only a few images in the system, which is why the issue wasn’t apparent at the time.
Here’s an approximation of the original call to retrieve the images through the Sitefinity API:
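The original code block didn’t survive, so here is a reconstruction from memory; the exact method and property names may not match the Sitefinity 4 API precisely:

```csharp
// Approximate reconstruction of the original (slow) query: it starts
// from the set of ALL images in the system and filters each one by
// walking up to its parent album's title.
var manager = LibrariesManager.GetManager();
var images = manager.GetImages()
    .Where(i => i.Parent.Title == "Foo"
             && i.Status == ContentLifecycleStatus.Live)
    .ToList();
```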
In this bit of code, we are querying for all images that belong to the image album named “Foo”. Note that the query is structured in a way where we query through the image object to determine what album it belongs to.
The above query took about 2 seconds to return roughly 30 Sitefinity Image objects. This would not do. Therefore, I took to tweaking the query. I tried several different things while maintaining the original approach, but since Sitefinity’s LINQ provider isn’t a complete implementation, I couldn’t make much headway.
I eventually decided to rethink the approach and query for the images by going in through the album instead of going through the image object.
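A sketch of the reworked query, again reconstructed from memory rather than the original source, so treat the exact API names as approximate:

```csharp
// Reworked query: locate the single album first, then enumerate only
// the images that belong to it, instead of scanning every image in
// the system.
var manager = LibrariesManager.GetManager();
var album = manager.GetAlbums()
    .FirstOrDefault(a => a.Title == "Foo");
var images = album.Images()
    .Where(i => i.Status == ContentLifecycleStatus.Live)
    .ToList();
```

The win comes from narrowing the candidate set up front: with ~250 albums but only ~30 images in the target album, filtering through the album avoids touching every image record.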
This reduced the query time down to about .2 seconds.
Nerd service announcement:
If you played video games in the 80s and/or 90s and have never heard of, or watched, Matt Chat, you are in for a treat. One of my friends aptly described it as, “Behind the Music for vintage video games.”
Matt Barton (college professor and author) debuted Matt Chat in February 2009 with a low-production-quality, but loving, retrospective of SSI’s classic AD&D CRPG, “Pool of Radiance.” Since then, Matt has produced an additional 100 episodes and has improved the production value by leaps and bounds. In addition to his editorial retrospectives, Matt began doing interviews with game developers around episode 40.
1: Pool of Radiance
3: Defender of the Crown
7: The Sims
8: The Secret of Monkey Island
9: The Oregon Trail
14: The Lost Vikings
15: The PLATO Computer System
16: Lode Runner
17: Ultima VII, The Black Gate
18: Summer Games
20: Worms and Artillery Games
21: Super Mario Kart
22: Deja Vu, Uninvited, Shadowgate, and MacVentures
23: Planescape Torment
24: Star Control II and the Spacewar Legacy
25: Knights of the Old Republic
26: David Crane's Ghostbusters
28: Maniac Mansion
31: A Rockstar Ate My Hamster
32: Tomb Raider
33: Jade Empire
34: System Shock 2
35: Alone in the Dark
38: Legacy of the Ancients
39: World of Warcraft Part One
39: World of Warcraft Part Two
40: Sword of Fargoal with Jeff McCord
41: The History of Cinemaware with Bob Jacob
42: Dragon Age Origins
44: Ralph Baer, the Father of Videogames
46: Choose Your Own Adventure with R.A. Montgomery
47: Quest for Glory
48: Dungeons of Daggorath
49: Nancy Drew featuring Jessica Chiang
50 Part 1: Leisure Suit Larry featuring Al Lowe
50 Part 2: Leisure Suit Larry featuring Al Lowe
51: Interview with John Romero (Early Days)
52: Wolfenstein 3D with John Romero
53: Doom with John Romero
54: Quake with John Romero
55: Daikatana with John Romero
56: Ocarina of Time
57: Tunnels of Doom
58: Heroes of Might and Magic
59: The Settlers
60: X-COM, UFO Defense
61: Sid Meier's Pirates
62: Chris Avellone's Early Days
63: Planescape Torment with Chris Avellone
64: Sean Cooper's Early Days
65: Syndicate with Sean Cooper
66: Fallout with Tim Cain, Pt. 1
67: Fallout with Tim Cain Pt. 2
68: Arcanum and More with Tim Cain
69: Howard Scott Warshaw's Early Days
70: ET and Yar's Revenge with Howard Scott Warshaw
71: The Bard's Tale
72: Deus Ex
73: The Dig
74: Dune II
75: Interview with Megan Gaiser and Rob Riedl of Her Interactive
76: King's Quest
78: Arnold Hendrick Interview Pt. 1
78: Interview with Arnold Hendrick Pt. 2
78: Interview with Arnold Hendrick Pt. 3
79: Scott Adams' Early Days
80: Adventureland with Scott Adams
81: Questprobe and More with Scott Adams
82: Interview with Rebecca "Burger" Heineman Pt. 1
83: Rebecca Heineman Pt. 2
84: Rebecca Heineman Pt. 3
85: Rebecca Heineman Pt. 4
86: Bard's Tale IV and Wasteland II with Rebecca Heineman
87: Twilight Scene it with Don Kurtz (censored)
88: The Donimator Gets His
89: Bard's Tale and Wizardry with Brian Fargo
90: Wasteland and Fallout with Brian Fargo
91: The Fall of Interplay with Brian Fargo
92: Mail Order Monsters
93: Scratches and Asylum with Agustín Cordes
94: Interview with Agustín Cordes Pt. 2
95: Skylanders and more with Paul Reiche and Fred Ford
96: Star Control and More with Paul Reiche and Fred Ford
97: The Horde and More with Fred Ford and Paul Reiche
98: Scott Miller Interview Pt. 1
99: Duke Nukem with Scott Miller
100: Scott Miller will Live Forever
101: Baldur's Gate
Today was a good day. As mentioned previously, I’ve been looking for a new career opportunity for the past month-and-a-half or so. My journey came to a head yesterday when I received two fantastic job offers. I’m happy to report that I have accepted an offer to join the Nerdery as a Software Engineer. I’m looking forward to working with this group of people who are way smarter than I am, and for the chance to grow with this unique company.
Additionally, I’ve been reflecting on the overall job search experience. I am very aware of how lucky I am to have rubbed shoulders with many insanely smart and experienced professionals. I really appreciate the time everyone took to meet with me. Life is good.
It’s been seven years since I was last looking for a job, so I’m not sure if this is a new trend or not: Over half the companies I’ve started the interview process with have given me time-unlimited full-stack coding challenges as part of the interview process. I just completed my third one in two weeks and I thought I’d share my thoughts about this practice in general.
Here are highly condensed versions of the requirements for each challenge:
- Create an ASP.NET MVC application that consists of one page that allows users to add, remove, and rate movies. Movie ratings are to be indicated with a star rating control similar to what Netflix uses.
- Create an ASP.NET MVC website that allows users to upload and view images. If the image is more than 500 pixels wide, or 700 pixels tall, downsize the image to the maximum allowed size, but maintain the aspect ratio. All images should be stored on the server, including the original image when images are resized.
- Create an ASP.NET application (Webforms or MVC) that allows users to add and vote on Xbox game titles for the company break room. Users can only vote once a day, and they cannot vote on the weekends. Any user can mark a game as owned, in which case it ends up on a list of owned games. The titles and votes will be stored / retrieved via a set of WCF services that are provided for you. In addition to sending the code in for review, submit a URL to a running copy of the application.
What I Liked About the Process
First off, I really liked that these companies wanted to see code. I think far too many places hire developers without seeing a line of code. True, you can determine what somebody is capable of through conversation (especially at senior / expert levels), but I think it’s a pretty big risk. Some people are really good at talking and/or look good on paper when in truth they don’t know what they are doing.
Secondly, I really enjoyed doing these challenges. They’ve kept me from getting rusty during this period of unemployment. Additionally, I had to learn at least one thing for each of the applications. For company A’s challenge, I had to learn how to create a widget that used JSONP. For company B’s challenge, I had to learn to resize an image while keeping its aspect ratio using only the core .NET framework. For company C’s challenge, I learned how to use the MVCContrib Grid and the jquery.dataTable plugin. I also learned and used AppHarbor to host the application.
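The aspect-ratio resize from company B’s challenge can be done with nothing but System.Drawing. A minimal sketch (the 500×700 limits come from the challenge description; the helper name is my own):

```csharp
using System;
using System.Drawing;
using System.Drawing.Drawing2D;

static class ImageResizer
{
    // Downscale an image to fit within maxWidth x maxHeight while
    // preserving aspect ratio; returns the original if it already fits.
    public static Image ResizeToFit(Image source, int maxWidth, int maxHeight)
    {
        // The scale factor is the tighter of the two constraints,
        // capped at 1.0 so we never upscale.
        double scale = Math.Min(1.0,
            Math.Min((double)maxWidth / source.Width,
                     (double)maxHeight / source.Height));
        if (scale == 1.0)
            return source;

        int width = (int)Math.Round(source.Width * scale);
        int height = (int)Math.Round(source.Height * scale);

        var resized = new Bitmap(width, height);
        using (var g = Graphics.FromImage(resized))
        {
            g.InterpolationMode = InterpolationMode.HighQualityBicubic;
            g.DrawImage(source, 0, 0, width, height);
        }
        return resized;
    }
}
```

Per the challenge requirements, the original upload would be saved to disk alongside the resized copy, e.g. `ImageResizer.ResizeToFit(uploaded, 500, 700)`.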
What I Didn’t Like or Was Uncomfortable With During the Process
Finding time to do all three of these within a couple of weeks was a little rough. I did find the time, but part of me worried that the companies were wondering why I hadn’t turned the challenge in yet. In reality, they didn’t know I was working on three of them simultaneously. This is something I probably should have communicated to them.
The thing I was most uncomfortable with was that I felt like I had more than enough rope to hang myself with in terms of doing things that weren’t wrong per se, but that the reviewer might frown upon due to personal taste. One thing that I was really torn on was how C# 2/3/4 idiomatic I should make my code. Taking advantage of generics, lambdas, LINQ, anonymous types, etc… could cause the code to look like gibberish to the reviewer if he or she is still writing code in C# 1 style. On the flip side, I felt that I would look like I was stuck in the past if I didn’t write modern C#. I’m not sure that finding the middle ground is the right thing to do in this case either, as it could look inconsistent.
Along the same lines, I found it very difficult to decide on the right level of architectural complexity to use. These apps were all a bit more than trivial, so I could have gone either way with the complexity. As with idiomatic C#, I found myself wondering if the reviewer was going to think I was oversimplifying or overcomplicating things. Additionally, the reviewer may actually want me to overcomplicate things a bit to show what architectural patterns I know.
These two issues are things that usually get sorted out when you hash over ideas with people in person, but I found it difficult to know who my audience was when given nothing but a set of requirements.
All-in-all, I thought it was a positive and enjoyable experience. If I’m ever in the position to help with hiring again, I would probably push hard for code samples to be part of the process. However, I wouldn’t give such open-ended challenges; I’d give multiple smaller, more focused problems.
This post is an attempt to capture how my previous team dealt with dependency / package management. The team, at its largest point, consisted of about 15 developers. There were roughly 200 3rd party dlls, and roughly 150 internal dlls in the dependency mix. No single app needed all 350 dlls, but groups of these dlls were common to applications in the same domain space of the enterprise. 3rd party dlls were things such as the MS Enterprise Library, image conversion libraries, desktop scanner interop libraries, etc… Internal dlls were things such as common utility libraries, WCF service contracts, DTOs etc…
Until the recent development of projects such as Nuget and OpenWrap, dependency management in .NET has been a big problem. The larger a development group’s topology is, the more of a pain point it is. Because this is the case, a lot of teams don’t even realize they are going to have issues until the pain is upon them. Additionally, I think the complexity of describing the problem has helped to keep package management as an elephant in the .NET room for a long time.
Since there has been abundant discussion on this issue lately, I’m going to skip describing the problems with dependency management and get straight into what we tried over the years, and what the final solution ended up being. It’s important to note that my team used TFS (2008) for source control because some of the steps we took were in response to TFS’ weaknesses.
First attempt, ending in failure:
- Manage all 3rd party dlls by putting them in a [sln]/bin folder (tracked by TFS).
- Manage all internal dependencies as shared projects across multiple slns.
Second attempt, ending in failure:
- Manage all 3rd party and internal dlls by putting them in a [sln]/bin folder.
Third attempt, ending in success:
- Build a custom package management system.
This solution isn't perfect. Namely, it relies on developers to remember to run the executable from time to time. Additionally, there are potentially some versioning scenarios that could occur between packages that expect different versions of sub-dependencies. However, that issue never manifested itself in our environment.
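The tool itself was internal and never published, but the general shape was an executable that copies named packages from a central location into the solution's lib folder. A purely hypothetical sketch (the share path, manifest file, and folder layout are all invented for illustration):

```csharp
using System;
using System.IO;

// Hypothetical package-sync console tool: for each package named in a
// plain-text manifest, copy its dlls from a central share into the
// solution's lib folder. Developers run this periodically to pick up
// updated dependencies. All names and paths here are illustrative.
class PackageSync
{
    const string Share = @"\\buildserver\packages";

    static void Main()
    {
        string libDir = Path.Combine(Environment.CurrentDirectory, "lib");
        foreach (var line in File.ReadAllLines("packages.txt"))
        {
            string package = line.Trim();
            if (package.Length == 0) continue;

            string source = Path.Combine(Share, package);
            string target = Path.Combine(libDir, package);
            Directory.CreateDirectory(target);

            // Overwrite local copies so everyone builds against the
            // same blessed binaries.
            foreach (var dll in Directory.GetFiles(source, "*.dll"))
                File.Copy(dll, Path.Combine(target, Path.GetFileName(dll)), true);
        }
    }
}
```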
Nuget is not the final answer for teams using TFS
I've been following the Nuget dev list closely, and Nuget is considered to be a development time dependency resolver only, not a build time resolver. This means that if you use TFS to track your Nuget package folders, you could still run into dll versioning issues.
Nuget team member David Ebbo has blogged that functionality has been added to allow use of Nuget without committing the packages folder to source control: http://blog.davidebbo.com/2011/03/using-nuget-without-committing-packages.html
According to Phil Haack, Nuget is going to be getting official support for non-committed packages: http://haacked.com/archive/2011/04/27/feedback-request-for-using-nuget-without-committing-packages.aspx
I know that many of us have had to face this problem, therefore I'd really enjoy hearing about how you addressed this issue.