Using the EntityAdapter for Azure Table Storage
I got a request for an example of how to use the EntityAdapter class I previously posted about. Here is an example of a PersonAdapter.
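A minimal sketch of such an adapter is below. This is not the listing from the original post; it assumes the EntityAdapter<T> base class exposes the wrapped model via a Value property and asks derived classes to supply the table keys, and the Person model and key choices are illustrative:

```csharp
using System;

// Illustrative domain model.
public class Person
{
    public string Email { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
}

// Sketch of a PersonAdapter; assumes the EntityAdapter<T> base class
// from the earlier post exposes Value and abstract key builder methods.
public class PersonAdapter : EntityAdapter<Person>
{
    public PersonAdapter()
    {
    }

    public PersonAdapter(Person person) : base(person)
    {
    }

    protected override string BuildPartitionKey()
    {
        // Partitioning on the email domain is an illustrative choice only.
        return Value.Email.Substring(Value.Email.IndexOf('@') + 1);
    }

    protected override string BuildRowKey()
    {
        return Value.Email;
    }
}
```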
I recently posted about an EntityAdapter class that can be the bridge between the ITableEntity that Azure table services requires and a domain model class that you actually want to use. I found an issue with this implementation: TableEntity.ReadUserObject and TableEntity.WriteUserObject, which the EntityAdapter relies on, only support mapping properties for types that are intrinsically supported by ATS. This means your domain model will end up with default values for any property that is not a String, Binary, Boolean, DateTime, DateTimeOffset, Double, Guid, Int32 or Int64.
I hit this issue because I started working with a model class that exposes an enum property. The integration tests failed because reading the entity back through the adapter returned the default enum value for the property rather than the value I had written to the table. I have updated the EntityAdapter class to cater for this by using reflection and type converters to fill in the gaps.
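The shape of that fix is sketched below. This is not the exact implementation from the updated class; the helper and its member names are mine, and it assumes any unsupported value can be round-tripped as a string through its TypeConverter (which covers enums for free):

```csharp
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Linq;
using System.Reflection;
using Microsoft.WindowsAzure.Storage.Table;

// Sketch of filling in the properties that ReadUserObject/WriteUserObject
// skip. Anything not natively supported by ATS is stored as a string via
// its TypeConverter and converted back on read.
internal static class EntityPropertyFiller
{
    private static readonly Type[] _supported =
    {
        typeof(string), typeof(byte[]), typeof(bool), typeof(DateTime),
        typeof(DateTimeOffset), typeof(double), typeof(Guid),
        typeof(int), typeof(long)
    };

    public static void WriteAdditionalProperties(
        object value, IDictionary<string, EntityProperty> properties)
    {
        foreach (var property in GetUnsupportedProperties(value.GetType()))
        {
            var converter = TypeDescriptor.GetConverter(property.PropertyType);
            var propertyValue = property.GetValue(value, null);

            properties[property.Name] =
                new EntityProperty(converter.ConvertToInvariantString(propertyValue));
        }
    }

    public static void ReadAdditionalProperties(
        object value, IDictionary<string, EntityProperty> properties)
    {
        foreach (var property in GetUnsupportedProperties(value.GetType()))
        {
            EntityProperty storedValue;

            if (properties.TryGetValue(property.Name, out storedValue) == false)
            {
                continue;
            }

            var converter = TypeDescriptor.GetConverter(property.PropertyType);

            property.SetValue(
                value, converter.ConvertFromInvariantString(storedValue.StringValue), null);
        }
    }

    private static IEnumerable<PropertyInfo> GetUnsupportedProperties(Type type)
    {
        return type.GetProperties().Where(x => _supported.Contains(x.PropertyType) == false);
    }
}
```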
I’ve been running this check-in procedure for several years with my development teams. The intention is for developers to get their code into an acceptable state before submitting it to source control, avoiding some classic bad habits around source control.
I’ve finally gotten around to adding some reg files for using WinMerge with VS2013. You can download them from the bottom of my Using WinMerge with TFS post. These reg files will configure VS2013 to use WinMerge for TFS diff/merge operations (no Visual Studio restart is required).
When working with Azure Table Storage you will ultimately have to deal with ITableEntity. My solution to date has been to create a class that derives from my model class and then implements ITableEntity. This derived class can then provide the plumbing for table storage while allowing the data access layer to return the correct model type.
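As a sketch, that approach looks something like the following, reusing the illustrative Person model from the earlier snippet:

```csharp
using System;
using System.Collections.Generic;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

// Derives from the domain model and bolts on the ITableEntity plumbing,
// so the data layer can hand instances back as plain Person objects.
public class PersonTableEntity : Person, ITableEntity
{
    public string PartitionKey { get; set; }
    public string RowKey { get; set; }
    public DateTimeOffset Timestamp { get; set; }
    public string ETag { get; set; }

    public void ReadEntity(
        IDictionary<string, EntityProperty> properties, OperationContext operationContext)
    {
        TableEntity.ReadUserObject(this, properties, operationContext);
    }

    public IDictionary<string, EntityProperty> WriteEntity(OperationContext operationContext)
    {
        return TableEntity.WriteUserObject(this, operationContext);
    }
}
```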
The problem here is that ITableEntity still leaks outside of the Azure DAL, even though it is represented as the expected type. While I don’t like my classes leaking knowledge inappropriately to higher layers, I also don’t like plumbing logic that converts between two model classes that are logically the same (although tools like AutoMapper do take some of this pain away).
Writing records to Azure Table Storage in batches is handy when you are writing a lot of records because it reduces the transaction cost. There are restrictions however. The batch must:

- target a single partition (every entity in the batch shares the same partition key)
- contain no more than 100 entities
- keep the total payload under 4MB
Writing batches is easy, even adhering to the above rules. The problem, however, is that it can start to result in a lot of boilerplate-style code. I created a batch writer class to abstract this logic away; a sketch of the idea follows.
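This is not the class from the post itself, just a hedged sketch of the idea: buffer entities, group them by partition key and flush in chunks of 100, so the batch rules hold without the caller thinking about them:

```csharp
using System.Collections.Generic;
using System.Linq;
using Microsoft.WindowsAzure.Storage.Table;

// Buffers entities and writes them as batch operations, honouring the
// ATS rules: one partition key per batch and at most 100 entities.
public class TableBatchWriter
{
    private const int MaxBatchSize = 100;
    private readonly CloudTable _table;
    private readonly List<ITableEntity> _items = new List<ITableEntity>();

    public TableBatchWriter(CloudTable table)
    {
        _table = table;
    }

    public void Add(ITableEntity entity)
    {
        _items.Add(entity);
    }

    public void Execute()
    {
        // A batch may only target a single partition key.
        foreach (var partition in _items.GroupBy(x => x.PartitionKey))
        {
            var pending = partition.ToList();

            // A batch may contain at most 100 operations.
            for (var index = 0; index < pending.Count; index += MaxBatchSize)
            {
                var batch = new TableBatchOperation();

                foreach (var entity in pending.Skip(index).Take(MaxBatchSize))
                {
                    batch.InsertOrReplace(entity);
                }

                _table.ExecuteBatch(batch);
            }
        }

        _items.Clear();
    }
}
```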
My upgrade pain with VS2013 and the Azure SDK 2.2 continues. Hosted build now fails with the following error:
The task factory "CodeTaskFactory" could not be loaded from the assembly "C:\Program Files (x86)\MSBuild\12.0\bin\amd64\Microsoft.Build.Tasks.v4.0.dll". Could not load file or assembly 'file:///C:\Program Files (x86)\MSBuild\12.0\bin\amd64\Microsoft.Build.Tasks.v4.0.dll' or one of its dependencies. The system cannot find the file specified.
While my Polish is non-existent, the answer can be found at http://www.benedykt.net/2013/10/10/the-task-factory-codetaskfactory-could-not-be-loaded-from-the-assembly/. The project templates for the web and worker role projects use ToolsVersion="12.0". This needs to be changed to ToolsVersion="4.0" for hosted build to be successful.
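Concretely, the change is to the Project element at the top of each affected project file; the ToolsVersion attribute is the only part that matters, the rest is standard template output:

```xml
<!-- Before: as generated by the project template -->
<Project ToolsVersion="12.0" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">

<!-- After: hosted build can load CodeTaskFactory again -->
<Project ToolsVersion="4.0" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
```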
I hit this one a couple of days ago and it had me scratching my head for a while.
I thought it was an issue with the tooling, perhaps something I uninstalled or installed. I had installed VS2013 with Azure SDK 2.1, then updated to 2.2 when it came out, but I had also uninstalled some packages related to VS2010, which I had used for years.
Turns out that this error presents itself when the solution doesn’t have something to debug. The message is a little misleading though.
My solution starts multiple projects on F5. One project is an Azure cloud project with web and worker roles (debugger attached) while the other is a local STS website (no debugger attached), all of which run in IIS Express. This error popped up either when no projects were set to run or when the STS project was set to launch without the debugger and the cloud project was set to None for the multiple project start. Either case causes VS not to debug because there is nothing configured for it to attach to.
I have upgraded a decent sized solution to Azure SDK 2.2, VS2013 and most of the latest bits (still MVC4 though). All of a sudden the web role is not serving any content. In fact the web role can’t even start and the Application_Start method never gets invoked. IIS Express 8.0 only renders the following content:
The page cannot be displayed because an internal server error has occurred.
I’m super excited to announce the first release of Headless, a HTTP acceptance testing framework. It makes GET and POST calls and provides a framework for interacting with HTML pages. Oh, and it is super fast.
I have completed converting the acceptance test suite of an enterprise web application from WatiN to Headless. The test suite has around 360 tests, of which only about 6 could not be converted to Headless. The execution time of the test suite dropped from 1:04 hours down to just over 7 minutes. Yeah, it’s fast.