Tuesday, March 08, 2011

Adding the sequence number to a LINQ query

LINQ queries are a powerful way to keep your code expressive and, thanks to deferred execution, they are quick. But what if you need the value of the index which LINQ used whilst building the projection? I had this issue and found the solution was to use the Select method overload which accepts a Func<TSource, int, TResult> for the selector.

With a for loop this is simple to accomplish as the index value is available in each iteration as it is controlling the loop:
This code prints this to the console:
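The original listing is missing here; a sketch of the kind of loop and output the post describes (the sample strings are my own):

```csharp
string[] lines = { "one", "two", "three" };

// The index variable controls the loop, so it is available in each iteration
for (int i = 0; i < lines.Length; i++)
{
    Console.WriteLine(i + " " + lines[i]);
}

// Console output:
// 0 one
// 1 two
// 2 three
```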


Creating a LINQ query with the index in the projection is not as obvious. My first attempt was to use the Count() method of the line parameter in the projection.


But when the result was written to the console the problem became apparent.


In the projection, line.Count() is returning the string length of each line in the array. Still, first attempts are always a good way to discover how something could work.

Fortunately the LINQ Select method has two overloads. Both iterate over an IEnumerable<T>, but the delegate for the selector differs. The code above uses the first overload, which takes a Func<TSource, TResult>. The second overload expects a Func<TSource, int, TResult>; here, the int parameter is assigned the current index of the sequence.
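To illustrate the two overloads side by side (the original listings are missing; the sample strings are my own):

```csharp
using System;
using System.Linq;

class Program
{
    static void Main()
    {
        string[] lines = { "one", "two", "three" };

        // First overload, Func<TSource, TResult>:
        // line.Count() is the character count of the string, not the index.
        foreach (var s in lines.Select(line => line.Count() + " " + line))
            Console.WriteLine(s);   // 3 one, 3 two, 5 three

        // Second overload, Func<TSource, int, TResult>:
        // the int parameter is the current index of the sequence.
        foreach (var s in lines.Select((line, index) => index + " " + line))
            Console.WriteLine(s);   // 0 one, 1 two, 2 three
    }
}
```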

The first code example can now be changed to something more expressive


Running this code displays the following in the console


Many of the LINQ methods have this overload for the selector. Using it I have been able to continue using LINQ to specify how I want to transform the array, which means my code is more expressive. Plus, the LINQ query is quick, as the projection is only populated when the foreach loop executes to display the result.

Saturday, February 12, 2011

IIS application pool and domain identities

Follow these steps to specify a non-standard identity for an IIS application pool. For this example I will use the account domain\WebUser
  1. In Administrative Tools open the Local Security Policy program and find the Log on as a service policy under Local Policies, User Rights Assignment. Click Properties and add the user domain\WebUser
  2. Open Windows Explorer and go to C:\Windows\Temp. Open Sharing and Security and add the user on the Security tab, granting enough rights to read and write files
  3. Open a command prompt and change to c:\windows\microsoft.net\Framework\v2.0.50727. Run aspnet_regiis.exe -GA domain\WebUser
  4. In IIS open the properties of the application pool and go to the identity tab. Click Configurable and enter the username and password.
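As a rough command-line companion to steps 2 and 3 (the account name is the one from the example; the cacls grant is a sketch, and you may prefer the Explorer dialog):

```
rem Grant the account change rights on the temp folder (run from an elevated prompt)
cacls C:\Windows\Temp /E /G domain\WebUser:C

rem Register the account with ASP.Net
cd /d c:\windows\microsoft.net\Framework\v2.0.50727
aspnet_regiis.exe -GA domain\WebUser
```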

Monday, January 10, 2011

IIS 6 and the HTTP 401.3 error

I love it when I find a new tool to use, and I love it even more when it is really useful and saves me hours of work. Recently I had the opportunity to try out ProcMon. This is what happened.

Our test web server started returning HTTP 401.3 errors. The cause was quick to find: the permissions on the root website folder had been changed and the IIS accounts were missing. So the fix appeared simple; re-apply the permissions and they would cascade all the way down the tree. I added the local IUSR account but it failed to fix the problem. I spent several hours with the MSDN documents making sure I had the correct users and groups applied, but to no avail. I could not find a way to return the server to normal operation.

Finding the problem

The next day I resolved to find the problem; no more hacking around throwing users at a dialog box. For help I turned to Process Monitor (ProcMon), part of the SysInternals suite of tools. ProcMon is a superb tool for these situations. It collects all activity on the machine, showing a list of file, registry and network activity. Importantly for me, it also records the result of each operation.

I fired it up, attempted to load a web page from my browser, and then stopped the trace. Tracing all the activity on a server produces a metric ton of data; a one-minute trace on my PC generates ~300,000 events. For this reason ProcMon has good filtering: you can pick from a list of events and limit by a text value. I chose to filter the list by Result, only showing those which returned ACCESS DENIED.



With the filter applied there was only one event in the list: the IUSR account was trying to access the file from my browser request. Upon checking the permissions on the actual file I found they were different to the parent; all of the IIS accounts were missing. I forced the permissions down the tree and IIS started serving pages again.

Not just any tool but the right tool

ProcMon is the star here; without it I would have found the problem, but with a lot of guesswork and a great deal of time. With ProcMon I could see exactly what was happening when IIS tried to serve the page. Being able to see what happens at the core of a system is essential to fault finding, and having the right tool saves an enormous amount of time.

Tuesday, September 21, 2010

NAnt and the .Net 4 error

NAnt threw this odd error today.

The reason it was odd was the version of the .Net framework being used in the call: .Net 4.0 instead of 3.5. While I have 4.0 installed, I was trying to build a 3.5 project.

A little searching and I soon had a good guide on how to build 4.0 projects using NAnt, but this wasn't the solution to my problem.

Why 4.0?

It is possible to tell NAnt which framework to use, either with the -t: switch on the command line or in the main NAnt.exe.config file. I tried, and failed; NAnt threw the same error.

I wanted to stop NAnt using .Net 4.0, but how? Well, near the bottom of the config file is a list of supported Frameworks, of which 4.0 is one.

Removing the entry for 4.0 fixed the problem. Admittedly this is a quick fix, as NAnt clearly has a problem with .Net 4.0 on my PC, but that is not a problem to be fixed today.
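The entry in question sits inside the frameworks section of NAnt.exe.config. Heavily trimmed, it looks something like this (attribute details vary between NAnt versions, so treat this as a sketch):

```xml
<frameworks>
  <platform name="win32">
    <!-- ...entries for net-1.1, net-2.0, net-3.5 and so on... -->
    <!-- Deleting this element stops NAnt considering .Net 4.0 -->
    <framework name="net-4.0" family="net" version="4.0"
               description="Microsoft .NET Framework 4.0">
      <!-- ...task assemblies, tool paths, etc... -->
    </framework>
  </platform>
</frameworks>
```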

Update

Thanks to Rawdon for mentioning the path to the configuration file. You can find this in %InstallLocation%\NAnt\bin\NAnt.exe.config

Also from the comments: Paul Stewart has found the cause of the problem and written it up in this post.

Monday, September 13, 2010

TFS and MSBuild properties

My current client uses TFS as a build server and MSBuild for deployment scripts. Whenever I have to change the scripts I spend time searching for the various properties they expose. So this is my reminder, to kick-start the process next time.

MSBuild

There are many different ways of using MSBuild but the list of Reserved Properties is always a good start. An example property is “$(MSBuildProjectDirectory)”, which returns the directory where the project file is located. From here I can often navigate using relative paths to the various places I need to go.
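For instance, a small target can echo a reserved property to confirm a path before relying on it (a sketch; the target name is my own):

```xml
<Target Name="ShowPaths">
  <!-- MSBuildProjectDirectory: the folder containing the project file -->
  <Message Text="Project folder: $(MSBuildProjectDirectory)" />
  <!-- MSBuildProjectFile: the file name of the project itself -->
  <Message Text="Project file: $(MSBuildProjectFile)" />
</Target>
```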

Any talk of MSBuild would not be complete without mentioning these extension libraries: SDC Tasks and MSBuildTasks. They provide a host of extras, from re-writing XML files to creating web sites.

TFS

The build process (Team Build) is built upon MSBuild and, like all processes managed with MSBuild, it provides a set of extensions for querying the environment and managing the process. The first port of call is the MSDN page. After reading through that lengthy page, I would recommend having a look at Martin Woodward’s post: useful Team Build properties.

Sunday, May 16, 2010

Picasa 3 OS X and a shared image database

At home I use Macs and for viewing photos I use Picasa. It is the best tool I have found for the task: it is happy to let you choose where to put the images, and if I make corrections it doesn't change the originals.

Central library

One annoyance though is that each computer has its own local library, so when I add new pictures both libraries have to scan the watched folders.  So I spent an hour hacking around to see if I could move the local database to a share on my home server.

Symlinks

Picasa 3 keeps the image database in:


I found the path by looking in the Preferences -> Network page.  Then I moved into a terminal window to:
  • Create a PicasaDb on my home server
  • Copy the local database to the folder on the share
  • Rename my original database
  • Create a symlink to the database now living on the server share
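The steps above can be sketched as shell commands. The real paths (the local Picasa3 database folder and the mounted server share) are assumptions, so this rehearses the moves in a scratch directory; substitute your own paths when running it for real:

```shell
# Scratch directory standing in for the real locations
SCRATCH=$(mktemp -d)
SRC="$SCRATCH/local/Picasa3"        # stands in for the local database folder
SHARE="$SCRATCH/server/PicasaDb"    # stands in for the PicasaDb folder on the share
mkdir -p "$SRC" "$SHARE"

cp -R "$SRC" "$SHARE/"              # copy the local database to the share
mv "$SRC" "$SRC.bak"                # keep the original database as a fallback
ln -s "$SHARE/Picasa3" "$SRC"       # symlink the old location to the share
```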


With all this in place I fired up Picasa and it still worked; first hurdle over.  I then edited an image and still no problems. I checked the timestamps on the database files on the server and they had been updated; plus, Picasa had not created a new database locally.

Next time I will attempt the same on the other computer to find out if Picasa will happily share a database.

Wednesday, January 06, 2010

Reset Remote Desktop Connections

Being able to connect to the desktop of another server is an essential part of most developers' working days. Whether it is to configure IIS or to kick off a deployment, starting up the Remote Desktop Connection client is often the quickest way to complete the task.

Unfortunately the basic configuration only allows three connections at any one time. Also, if someone just closes the client, their connection is not cleared; it will hang around in a disconnected state until someone connects to the physical machine to clear any unused connections.

There are two DOS commands which solve this problem: the oddly named QWINSTA and RWINSTA.
qwinsta /server:SuperServer
The displayed list will include the session Id which you can use with the next command, rwinsta, to reset the sessions. Type the following to reset session 1 on SuperServer:
rwinsta 1 /server:SuperServer
Be sure to pick sessions with the state “Disc”, as connections marked as “Active” may really be active.

Often the simplest tools yield the biggest gains. The discovery of these two commands has saved many hours of work. I no longer have to go through the IT support process and have an engineer go to the physical machine.

Tuesday, December 01, 2009

PowerShell and the event log

One of the strengths of PowerShell is the easy access to WMI it provides at the command line. Before PowerShell, accessing WMI involved doing all the work from within VBScript and processing the results using the facilities available in the scripting language. PowerShell, on the other hand, is built on top of the .Net framework, so manipulating the results is far easier. I now find myself stepping away from the desktop and opening the console for a lot more tasks; I have always believed that you should tell the machine what you want it to do rather than doing it yourself.

To demonstrate this, the code example below will:
  • Query the application event log of a remote server
  • Order the log entries by the date they occurred
  • Return the first 5 results from the set

The cmdlet Get-WmiObject is the gateway to WMI and allowed me to complete the first step with this simple command

As the results from the WMI query are stored in an array, I’m now free to manipulate the result set further using the commands available in PowerShell. Completing items two and three on my list only requires this command
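The original commands are missing from this post; a sketch of what they might have looked like (the server name is a placeholder):

```powershell
# Step one: query the Application event log on a remote server
$events = Get-WmiObject -Class Win32_NTLogEvent `
                        -ComputerName "TestServer01" `
                        -Filter "LogFile = 'Application'"

# Steps two and three: order the entries by date and take the first five
$events | Sort-Object -Property TimeGenerated -Descending | Select-Object -First 5
```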


The big win here is being able to run a query on a remote server but manipulate the result set on my local machine. WMI has a large set of providers which are now only a query away from my console.

Tuesday, November 03, 2009

ASP.Net and the disappearing Session value

Sometimes errors occur that really have me scratching my head, and they usually occur when I move from my machine to the test environment.
While running through some changes I’d made to a form the behaviour of HttpSessionState became erratic, randomly returning either null or the value I had set. I fired up DebugView and captured this trace:



Here, three different threads are shown serving the page requests and, as I would expect, only one thread is returning the value I had set.
This sounded like a configuration issue, so I had a look in IIS Manager and found the application pool was configured to be a Web Garden.



This now made sense: my form was using InProc session handling, and Web Gardens behave like a single-machine Web Farm, requiring a StateServer for session handling.
The solution to my problem was simple: set the number of worker processes to 1, as I don't need the resilience of a Web Garden.  If you do, though, Nicholas Piasecki has a great write-up here.

Friday, October 16, 2009

Running Selenium tests in Visual Studio

Troublesome testing

I found the amount of time I spent manually testing forms as part of my development process painful and thought there must be a better way of doing this task. The forms are tightly coupled to the database, so any refactoring becomes a risky process due to the lack of unit tests. One answer I found that helped me is the Selenium web testing framework from ThoughtWorks (http://seleniumhq.org/).

You can download the example code for this post here: http://keithbloom.s3.amazonaws.com/Selenium.Example.zip The only requirement is that you have Java installed on your PC.

The framework is built around a language called Selenese which executes actions in a web browser. There is a client library available for .Net, so the tests can be run in NUnit or any similar tool. For the NUnit integration to work you have to be running the Selenium RC web server to host the test and a web server for the ASP.Net page being tested.

This could be a lot of work upfront just to run some tests. So I have written some helpers to configure the environment automatically.

Web server

The development web server which ships with Visual Studio (once called Cassini) is a perfect tool for hosting the ASP.Net page under test. An instance is started for the test suite with a hard-coded port of 8085 and the URL set to localhost. As all the projects live at the same level in my project tree, I am able to hard-code the path as well.

Selenium RC

The process that hosts the tests is a Java-based application called Selenium RC. When a test is run in NUnit, the Selenium client library sends Selenese commands to this process over HTTP. When the test suite first runs, I start a Java process and point it at the Selenium RC jar file; this only gets closed when all the tests are finished.

Selenium Runner

The helpers are combined in an abstract class which starts the web server and the Selenium RC process, so any test class that derives from it will automatically have the testing environment configured. The only data required from implementors is the name of the project under test:
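The original listing is missing; a sketch of how such a base class might look (the helper class names, ports and browser string are assumptions, not the original code):

```csharp
// Assumed sketch: DevWebServer and SeleniumRc are hypothetical helpers
// wrapping Cassini and the Selenium RC jar respectively.
public abstract class SeleniumTestBase
{
    protected ISelenium selenium;

    // The only data implementors must supply: the project under test
    protected abstract string ProjectName { get; }

    [TestFixtureSetUp]
    public void StartTestEnvironment()
    {
        DevWebServer.Start(ProjectName, 8085);   // hypothetical: starts Cassini on port 8085
        SeleniumRc.EnsureStarted();              // hypothetical: launches the RC jar once

        selenium = new DefaultSelenium("localhost", 4444, "*firefox", "http://localhost:8085/");
        selenium.Start();
    }

    [TestFixtureTearDown]
    public void StopTestEnvironment()
    {
        selenium.Stop();
    }
}
```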

Speedy

I have been using this for a week now, at first just as a means to run through the forms quickly. However, I have found other uses for it and have started adding asserts for common scenarios. One Selenium command that is proving useful is selenium.GetHtmlSource(), which returns the full page source as a string. This lets me run a test, see the source appear in the Output window, and then check for the presence of certain items. I have used this method to check that certain Omniture tags are being generated:
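The original test is missing; it might have looked something like this (the page URL and the Omniture variable name are assumptions):

```csharp
// Hypothetical test deriving from the abstract runner described above
[Test]
public void Page_Contains_Omniture_Tags()
{
    selenium.Open("/Default.aspx");
    string source = selenium.GetHtmlSource();

    Console.WriteLine(source);                 // shows up in the Output window
    StringAssert.Contains("s.pageName", source);
}
```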

I am happy with the addition of UI testing to my development process. Here it helps me to deal with legacy code which requires some work. It is now possible for me to start refactoring, which is something that can only be done with this kind of test coverage.

Friday, September 25, 2009

Viewing PreExecute script errors in RedDot CMS

One problem I have always encountered with RedDot is debugging the pre-executed script blocks.  Fortunately Gavin Cope came up with this superb solution, which also has the benefit of moving the pre-executed files out of the main RedDot folder.

Modifications

I have added to Gavin's work by:
  1. Writing the error to an HTML file so it can be viewed easily in the browser
  2. Adding a page with a list of all the error files in the logs folder

To create an HTML file for the error, replace the call to WriteToFile with the code below. Note, I have also changed the path for the PreExecute folder.


For the list of errors, create a new file in the PreExecute folder called ErrorList.asp and paste in the following script:
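The original script is missing; a minimal sketch of such a page, assuming the error files are HTML files in a logs subfolder (the folder name and markup are assumptions):

```asp
<%@ Language=VBScript %>
<%
' Hypothetical listing page: links every HTML error file in the logs folder
Dim fso, folder, file
Set fso = Server.CreateObject("Scripting.FileSystemObject")
Set folder = fso.GetFolder(Server.MapPath("logs"))

Response.Write "<html><body><h1>PreExecute errors</h1><ul>"
For Each file In folder.Files
    If LCase(fso.GetExtensionName(file.Name)) = "html" Then
        Response.Write "<li><a href=""logs/" & file.Name & """>" & file.Name & "</a></li>"
    End If
Next
Response.Write "</ul></body></html>"
%>
```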


Again, thanks to Gavin for a great bit of lateral thinking which makes template development much simpler.

Now if someone can create a plug-in so we can create templates in something other than a web browser, that would be superb.

Wednesday, June 24, 2009

Powershell: Delete files containing a string

This is a handy PowerShell snippet. The first line creates a list of files containing a specified string, while the second line deletes each item in the list. For a sanity check, put -WhatIf after the rm command to see what would happen.
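The original snippet is missing; it might have looked like this (the path and search string are placeholders):

```powershell
# Line one: collect the paths of files containing the string
$files = Select-String -Path C:\temp\*.log -Pattern "ERROR" -List | ForEach-Object { $_.Path }

# Line two: delete them (append -WhatIf for a dry run)
$files | Remove-Item
```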


For my first attempt I tried to do the whole operation in one go but PowerShell threw an error. It took me a little while to realise that the select-string command still had the file open, which caused the delete to fail. Putting the list of files into a variable fixed this quickly.

Tuesday, May 19, 2009

3 reasons for using PowerShell

I wish to learn PowerShell as Windows finally has a great shell, making it possible to automate tasks that are long-winded in the GUI.

Here are three very simple tasks which were hard to achieve using the MS-DOS shell.

Connect to UNC paths

To browse a network share in cmd.exe you first had to map a drive. With PowerShell you can navigate to server shares as you would a local drive.

Easy to recurse folders and files

While it is possible to recursively delete files in cmd.exe, PowerShell has built-in support for the ** operator, making operations on sets of files very simple. For example, the following will delete all files with the extension .bak from C:\temp and all the folders within it.
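The original one-liner is missing; one way to express this (using the -Include/-Recurse form, which I can vouch for, rather than ** globbing):

```powershell
# Delete every .bak file under C:\temp, however deeply nested
Remove-Item -Path C:\temp\* -Include *.bak -Recurse
```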

Controlling remote servers

Often I have to administer a service on a remote machine; a recent example is starting the W3SVC on a server which kept failing. As PowerShell has access to the Windows Management Instrumentation (WMI) API, I can run this command instead of using IIS Manager over Remote Desktop:
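The original command is missing; a sketch of the kind of call described (the server name is a placeholder):

```powershell
# Start the W3SVC service on a remote machine via WMI
(Get-WmiObject -Class Win32_Service -ComputerName "WebServer01" -Filter "Name = 'W3SVC'").StartService()
```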


In fact, PowerShell finally provides a useful wrapper around those rich but hard-to-get-at WMI APIs. The following script will enumerate all websites and display the path to their log files:
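The script itself is missing from the post; using the IIS 6 WMI provider it might look like this (the server name is a placeholder):

```powershell
# List every website and its log file directory via the IIS WMI provider
Get-WmiObject -Namespace "root\MicrosoftIISv2" -Class IIsWebServerSetting -ComputerName "WebServer01" |
    Select-Object ServerComment, LogFileDirectory
```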

Tuesday, May 05, 2009

Starting VNC Server remotely on OS X

I have a Mac Mini running as a small home server. So small in fact it runs headless, no keyboard, no mouse and no screen. This is fine as I can control it using VNC but every now and then it reboots and the VNC Server doesn't start at boot time.

So this script can be run from an SSH connection to get the server back up and running.
/System/Library/CoreServices/RemoteManagement/ARDAgent.app/Contents/Resources/kickstart -agent -restart

Thursday, March 26, 2009

Exclusion list for Tortoise SVN

I use ReSharper, an amazing extension to Visual Studio and one which seems to be on most developers' default tool lists. I have now started using Subversion and Tortoise for some personal projects, which means I have to manage my own file exclusion lists; specifically for ReSharper, which quickly turns a 100KB project into a 4MB one.

So here is my TortoiseSVN global ignore pattern:
_ReSharper*
*.resharper
*.dll
*.exe
*.ncb
*.o
*.obj
*.pdb
*.projdata
*.pyc
*.scc
*.suo
*.user
*.vspscc
*.webinfo
*proj.user
bin
obj
temp
tmp
user.config
With this set on all clients any files matching this pattern will have to be manually added to the repository.

Wednesday, February 25, 2009

NullReferenceException when using Rhino Mocks

While writing some tests for the following method I had a strange problem: when the test ran it threw a NullReferenceException for my stubbed User object.

The method under test is simple enough


as is the test case


The mocking framework is Rhino Mocks, and this was the solution to the problem.

With Rhino, a call to a stub (or mock) is expected only once, while my method checks the user object a number of times. Eventually the test will just check that the cookie has been set, so I can change the User stub to:
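The original stub code is missing; one way to express the fix in Rhino's AAA syntax (IUser and its member are assumptions, not the original types):

```csharp
// Repeat.Any() lets the method under test query the stub as often as it likes
var user = MockRepository.GenerateStub<IUser>();
user.Stub(u => u.IsAuthenticated).Return(true).Repeat.Any();
```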

Tuesday, February 24, 2009

Viewing .Net Debug messages

Back when I wrote a lot of ASP my best debugging tool was Response.Write, and like most people I had functions to help me handle all the information. Back in the present day, the .Net framework has moved on from echoing information to the response stream, but every now and then this method still has its uses. Take my recent example of writing an HttpModule: this code runs inside the .Net pipeline and fires various events. When I deployed it to a test server I wanted to find out when events fired and which instances were created. To help, .Net has the System.Diagnostics namespace, so I can write:
Debug.WriteLine("Post Authorize event has fired");
Great, but how do I view all this debugging output? I could have logged it, but I wanted a nice simple solution, so I used DebugView, part of the SysInternals suite.

This clever program will display anything written to the DefaultTraceListener, plus it can connect to a remote computer.

DebugView

Friday, February 20, 2009

Using MEF to extend a HttpModule

I watched Glenn Block's PDC demo of the Managed Extensibility Framework (MEF) last year and thought it would be a great tool to use if I ever created a client application that had to provide a plug-in mechanism. As I write web-based business applications, I didn't think this would be any time soon. Added to this, I am using the Castle Windsor Inversion of Control container, so I didn't think I had a need for another container.

Background

I’m currently working on a project that has a central application for user authentication. Each client application calls a web service to check the user’s credentials and, if all is well, a principal is added to the HttpContext. This process is all wrapped up in an HttpModule. My task is to create a variety of cookies to authenticate users with 3rd-party or legacy systems once they are authenticated. The best place to do this is within the HttpModule, but it is used by a variety of applications. To solve this I used MEF to create a pluggable PostAuthorization mechanism.

MEF and the HttpModule

My goal is to be able to write a component that can set a client cookie. As it is dealing with user information, it will also have to access the user stored in the HttpContext.  I created this interface to start with:


MEF creates dependencies at runtime, whereas an IoC container usually has a configuration process which defines the concrete instance to be created for each abstract service. In my HttpModule I created a list of CookieSetters that MEF could populate:


By attributing the collection with [Import] I'm telling MEF to wire up all the types it discovers that implement ISetCookies. Now that this is in place it is time to start up MEF and, this being an HttpModule, the place to do the work is in the Init method:


The creation of the container, which I lifted from the CodePlex site, sets up:
  • A catalog of all the assemblies in the bin folder, anything attributed with an Export will be added to the catalog.
  • A composition batch that will consume the exported assemblies, anything attributed with Import will be added to the batch.
  • A container to hold it all and orchestrate the magic.
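The listings for the interface, the import and the Init method are missing above. Pulling the pieces together, a sketch using the MEF API as it later shipped (the 2009 preview type names differed; all member names here are assumptions):

```csharp
using System.Collections.Generic;
using System.ComponentModel.Composition;
using System.ComponentModel.Composition.Hosting;
using System.Web;

public interface ISetCookies
{
    void SetCookie(HttpContext context);
}

public class CookieModule : IHttpModule
{
    // MEF fills this with every discovered part exporting ISetCookies.
    // (Shipped MEF uses [ImportMany] for collections; the preview used [Import].)
    [ImportMany]
    public IEnumerable<ISetCookies> CookieSetters { get; set; }

    public void Init(HttpApplication application)
    {
        // A catalog of all the assemblies in the bin folder...
        var catalog = new DirectoryCatalog(HttpRuntime.BinDirectory);

        // ...and a container to hold it all and orchestrate the magic
        var container = new CompositionContainer(catalog);
        container.ComposeParts(this);   // satisfies the [ImportMany] above

        application.PostAuthorizeRequest += PostAuthorize;
    }

    private void PostAuthorize(object sender, System.EventArgs e)
    {
        if (CookieSetters == null) return;   // nothing discovered

        foreach (var setter in CookieSetters)
            setter.SetCookie(HttpContext.Current);
    }

    public void Dispose() { }
}
```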

Now I can create my PostAuthorization event handler to kick off any CookieSetters MEF has found:


Here I first check that the list has been created and then call the SetCookie method on each one. Below is a trivial example of a CookieSetter:
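The original example is missing; a hypothetical setter might look like this (the cookie name and class name are assumptions):

```csharp
// Exported so the DirectoryCatalog scan of the bin folder will discover it
[Export(typeof(ISetCookies))]
public class LegacyAuthCookieSetter : ISetCookies
{
    public void SetCookie(HttpContext context)
    {
        var cookie = new HttpCookie("LegacyAuth", context.User.Identity.Name);
        context.Response.Cookies.Add(cookie);
    }
}
```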


Summary

When working with different teams, being able to provide simple extension points is great, and I can see MEF being the tool for the job. It will also be a good tool to have alongside an IoC container as the two are complementary.

As for the code above, it solves a problem for me but still needs some work, as the container will be built every time the module is loaded, causing a nasty performance hit. Storing it in HttpApplicationState as described by Michael Puleio would be the best solution.

Thursday, February 12, 2009

Quickstart: Blueprint CSS

Blueprint is a great CSS framework making grid based layout simple.

Getting started

  1. Download the latest version from http://github.com/joshuaclayton/blueprint-css/zipball/master and extract the ZIP
  2. The download is large but only the blueprint folder is required; click through the folders until you find it and copy it.
  3. Now go to the website you’re creating and put blueprint in the css folder.
  4. Everything is set, so you can start to follow the Blueprint quick start tutorial
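The quick start boils down to linking Blueprint's stylesheets in the page head; a typical include block looks like this (adjust the path to wherever you copied the blueprint folder):

```html
<link rel="stylesheet" href="css/blueprint/screen.css" type="text/css" media="screen, projection">
<link rel="stylesheet" href="css/blueprint/print.css" type="text/css" media="print">
<!--[if lt IE 8]>
  <link rel="stylesheet" href="css/blueprint/ie.css" type="text/css" media="screen, projection">
<![endif]-->
```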

Next steps

When you have finished with the quick start, read through the articles in the Wiki.

Wednesday, June 04, 2008

Open Social: Google's social networking API

Facebook recently released an API allowing developers to access the data they hold about their social network. One of my main concerns was developing for a closed system. It would appear that others thought likewise as Google are releasing a new API called Open Social. Applications developed to use Open Social will gain access to a similar set of data that Facebook provides, namely:
  • Profile Information (Name, education, etc.)
  • Friends Information (The "Social Graph")
  • Activities performed (A feed of information, e.g, "Keith joined the .Net Group")

Yet the application will work on any site that supports Open Social (potentially this could include Facebook). Currently the list includes MySpace, Plaxo, LinkedIn, Bebo and Salesforce.com, to name a few. We all have identities on many different sites, but with a standard API for personal data all systems would be able to share that information. Take for example a small CRM for communicating with customers, let's call it Inboxy, and a fictional customer, Niyati. Inboxy could be extended to consume and release data using Open Social, effectively turning it into a mini social network. Niyati could then allow Inboxy to access her social data; Inboxy would then have access to all of Niyati's data and be able to contribute to it. Here are a few scenarios to consider:
  • Niyati changes her date of birth on MySpace. Via the Activities feed Inboxy will know this has happened and be able to act on it.
  • I develop a pluggable application showing the place and date that Niyati will be studying a course. With Open Social this could appear on MySpace or Bebo without modification.
  • When Niyati has confirmed her attendance, Inboxy adds a message to her Activities feed which would be seen by all her friends on a variety of networks, e.g. "Niyati is off to San Diego on 13th February"

These are rough ideas of what may be possible; we won't know until the API is launched [http://code.google.com/apis/opensocial]. This must also be balanced against issues of data protection (would it be legal to expose customers' data in such a way?) and application overhead (how much work would be involved in creating and maintaining such systems?).