Functional Programming in F#

As a foray into the world of functional programming, I decided to spend 2 days learning as much as I could about it. This was solely a learning exercise, with no view towards creating any kind of application.

The reason I chose F# above all others was its easy integration into Visual Studio 2010 and its extensive online support.

I used the latest version of F# which can be downloaded here

I used the msdn wiki documentation which can be found here

Test project on github can be accessed here

Day One

I wanted Day 1 to be spent simply learning, and writing as little code as possible.

Real-World functional programming

I decided to spend the morning reading the first two chapters of Tomas Petricek and Jon Skeet’s “Real-World Functional Programming”, which gives examples in both C# and F# and is a good introduction to functional programming for someone coming from a C# background. It quickly gets going.

Immutability

F# is immutable by default. Once a value (or a symbol) has been set using the “let” keyword, its value cannot be changed. If you wish to specify a new value you have to create a new copy, or shadow the original. This means that in order to perform the following C# task

Ellipse ellipse = new Ellipse(new Rectangle(0, 0, 100, 100));
Rectangle boundingBox = ellipse.BoundingBox;
boundingBox.Inflate(10, 10);
return ellipse;
you would need to do the equivalent of the following (shown here in C#-style pseudocode, assuming an immutable Rectangle whose Inflate returns a new instance):
Ellipse ellipse = new Ellipse(new Rectangle(0, 0, 100, 100));
Rectangle boundingBox = ellipse.BoundingBox;
Rectangle smallerBox = boundingBox.Inflate(10, 10);
return new Ellipse(smallerBox);

Note how this forces you to new up a separate instance (lines 3 and 4) at each step. This immutability by default is one of the reasons that F# is touted as a good choice for multithreading.

You can, however, declare a symbol as mutable using the mutable keyword, again with the emphasis on making F# an easier language to transition to.
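
For example, a mutable binding uses the mutable keyword and the <- assignment operator:

```fsharp
let mutable counter = 0
counter <- counter + 1   // assignment uses <-, not =
printfn "counter is now %d" counter
```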

What F# can offer a C# developer

F# and therefore .NET offer a great bridge between the OOP style of C# and the functional paradigm. In 1.3.5 the authors state that you don’t have to make changes right away. You can “often use direct equivalents of C# constructs while you’re getting your feet wet.”

  • Functional programs in .NET can use OO design to structure methods.
  • You can still use all the other .NET libraries
  • The biggest impact of functional programming is at the lower level, i.e. the algorithms and behaviour of the application. For example, replacing complex nested loops.
  • Immutability by design makes it concurrency friendly
  • Rapid prototyping using existing .NET libraries

Syntax

Whilst there are a number of similarities in syntax between F# and C#, there are a few important keyword differences:

  • module - a way of separating logic – F# modules are compiled as classes which only contain static members, values, and type definitions.
  • type - the equivalent of a c# class
  • let - binds a “symbol” to a specific value; this could be the equivalent of a C# variable, or even a method where the parameters are defined
  • member - declares a member on a type, such as the F# equivalent of a C# property or method
  • |> - the pipeline operator, which allows you to chain function results together. A good example can be seen in this implementation of the Euler 25 puzzle:
type Fib() =
    static member GetFirstWithLength(number) =
        Seq.unfold (fun (current, next) -> Some(current, (next, current + next))) (0I, 1I)
        |> Seq.takeWhile (fun n -> n.ToString().Length < number)
        |> Seq.length

[<Test>]
member x.``Euler 25 : when I ask for the first with length of 1000`` () =
    Fib.GetFirstWithLength(1000) |> should equal 4782

Type inference

F# infers type in a much more sophisticated way than C#. It is able to work it out from the bottom up. For example, with the following code:

let add x y = x + y

Rolling over the add symbol in Visual Studio will show you that add is defined as

val add: (int -> int -> int)

This means add is a function that takes an int and returns a function which takes another int and produces the final int. F# only knows that x and y are ints because of the + operator. In contrast:

let removeY x y = x

Here there is nothing to pin the types down, so rolling over will show:

val removeY: ('a -> 'b -> 'a)

Meaning that the function will ultimately return 'a, which is, in simple terms, the F# equivalent of a generic type.

Currying

This leads us on to another important feature of F#: the ability to “curry” functions. This means you are able to apply a function in stages, without necessarily supplying all of its parameters at once.

let add x y = x + y

// partial application - we're only passing one parameter
let addFive = add 5

[<Test>]
member x.``Should add 5 because we have already passed that param in another method`` () =
    Assert.That(addFive 12, Is.EqualTo(17))

F# Koans project

In the afternoon I decided to go through the F# Koans project, which our very own Chris has blogged about here. They were pretty straightforward and do help take you through all of the key concepts, albeit in a very low level of detail.

Summary

By the end of the day, although my head hurt a bit, I really felt like I had made some progress, and felt ready to have a tinker with actually getting some working code up and running.

Day Two

Today I wanted to get some real world examples up and running. I had been using F# interactive which I found a little buggy and left me out of my comfort zone, so I figured I would need the ability to run unit tests in order to get anywhere.

MSDN wiki

I decided that going straight to the WikiBooks F# pages would be a better way of learning quickly. I would advise this as the first port of call for anyone wishing to learn F#, it’s excellent.

Setting up unit testing

Setting up unit testing proved a little tricky…

FSUnit

I first tried to get FSUnit working, which is a small script that, in conjunction with NUnit, gives you a more F#-like way of writing tests. The way I managed to get this working was to create an F# FSUnit project with one FSUnit module in it. I was then able to reference this project in my test library.

I quickly realised that all FSUnit really does is replace the normal NUnit calls with more F#-friendly ones. I had always been able to run my unit tests using my normal test runner (ReSharper) and nunit.framework.

Attributes in F# are denoted using the [< >] syntax, e.g. [<TestFixture>]:

[<TestFixture>]
type ``Given a request to 7digital API artist details`` () =
    let req = new SevenDigital.Api()
    let xname sname = XName.Get sname
    let xattr (elem: XElement) sname = elem.Attribute(xname sname).Value

    [<Test>]
    member x.``When I send the request for id 1`` () =
        req.GetArtistReleasesXml() |> should not (equal EmptyString)

The tests in full are here:

Real world examples

Stock Analyzer

https://github.com/7digital/fsharp-hacking/blob/master/fsharp/StockAnalyzer.fs

The first thing I built was a StockAnalyzer that pulls stock prices from the CSV files downloadable from Yahoo. This was built around a demo from an F# video I’d watched.

It’s worth noting that it uses the normal .NET System.Net.WebClient library, which is referenced and accessed in almost exactly the same way as you would in C#.

You can see some F# pipelining in action here:

let prices =
 csv.Split([|'\n'|])
 |> Seq.skip 1 // NOTE: skip the titles
 |> Seq.map (fun line -> line.Split([|','|])) // NOTE: each line is a string
 |> Seq.filter (fun values -> values |> Seq.length = 7) // NOTE: make sure we're only getting 7 values
 |> Seq.map(fun values -> System.DateTime.Parse(values.[0]), float values.[6]) // NOTE return a tuple of DateTime and float

Simply put, this uses Seq (an F# sequence produced by the Split function) and runs each stage in turn, passing the results of one function to the next in the pipeline (fun is F#’s keyword for an anonymous function). It’s similar to how Linq and IEnumerables work in C#.

The async {} wrapper allows you to run that piece of logic as an asynchronous call, which returns an Async object. This enabled me to use Async.RunSynchronously in the GenAnalyzers method. You could achieve the same without it, but each call to a ticker would have to wait until the previous one was done. You can also see the static keyword being used, which has exactly the same meaning as static in C#.

The StdDev member works out the standard deviation and demonstrates the use of some of the other Seq module functions, which are incredibly useful. Pairwise, for example, takes the nth and n+1th items in a sequence and returns them as a sequence of 2-tuples, as shown when you roll over it:

val pairwise : seq <'T> -> seq<'T * 'T>
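
A quick sketch of pairwise in action:

```fsharp
[1; 2; 3; 4]
|> Seq.pairwise
|> Seq.toList
// [(1, 2); (2, 3); (3, 4)]
```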

Also, the C# TestLibrary project demonstrates an example of how you can reference and include a compiled F# type in a C# project.

Note how the tuples returned from the artistReleases are given the default names of Item1 and Item2.

Consuming 7dig Api (artist/releases)

https://github.com/7digital/fsharp-hacking/blob/master/fsharp/SevenDigitalApi.fs

I also had a quick go at consuming a 7digital API endpoint, using LinqToXml to traverse the results and populate a tuple of int (containing the Id) and XElement. I would have liked to have had a play with manipulating the XML data in a functional way, but ran out of time.

Summary

This was a very interesting couple of days. I learnt a lot over a short space of time, and it helped me to understand a little more about the origins of C# Linq and IEnumerable.

At the moment, I still don’t fully see any real killer situations (other than the obvious rapid prototyping, or concurrency) where we should be using F# instead of C#.

This leads me to the conclusion that I need to investigate further as I may not have fully understood the potential of F# as a functional language, or even (heaven forbid!!) the role of functional languages themselves.

Links

Test Project on github
Real-World Functional Programming book
Wikibooks F#
F# Koans
Microsoft F# download
Chris’s blog

Posted in Software Development

OAuth in OpenRasta

Introduction

One of the things on my to-do list after my previous attempt to implement a service à la REST in Practice was Chapter 9, Web Security – The OAuth Protocol.

OAuth is something I have had very little experience with and as we use it extensively here for the API, I thought it would be a good opportunity to attempt to integrate it into OpenRasta.

I’d also seen a few posts on the OpenRasta messageboard about how to go about implementing custom security.

Day One

I spent a large part of day one reading up on OpenRasta and re-reading the above chapter. I wanted to implement OAuth in the same way as they had in the book, where they use OAuth to securely request “vouchers” redeemable against the payment endpoint in their RESTbucks app.

Using Attributes

OpenRasta (2.0.3) as a framework already has a few classes built in that allow you to hook in your own custom authentication, but in this version they are very limited. They consist of the following:

  • AuthenticationResult – a lightweight object that contains information about an AuthenticationRequest
  • IAuthenticationScheme – an interface that provides Challenge and Authenticate methods.

I also noticed that they allowed for the use of a [RequiresAuthentication] attribute which I got excited about as it allowed for the implementation of an Authentication strategy on a per method basis. This then hooks up a RequiresAuthenticationInterceptor when run and adds it to the list of operation interceptors for this call only.

RequiresOAuthAttribute

Therefore, I followed this exact same procedure for OAuth and created the [RequiresOAuth] attribute and added a RequiresOAuthInterceptor.

[AttributeUsage(AttributeTargets.Method | AttributeTargets.Class, AllowMultiple = false, Inherited = true)]
public class RequiresOAuthAttribute : InterceptorProviderAttribute
{
    public override IEnumerable<IOperationInterceptor> GetInterceptors(IOperation operation)
    {
        return new[] {
            new RequiresOAuthInterceptor(DependencyManager.GetService<ICommunicationContext>(),
                                         DependencyManager.GetService<IAuthenticationScheme>())
        };
    }
}

The interceptor itself had very little logic:

public override bool BeforeExecute(IOperation operation)
 {
        AuthenticationResult authenticationResult
            = _scheme.Authenticate(_context.Request);
        if (authenticationResult is AuthenticationResult.Failed)
        {
                _scheme.Challenge(_context.Response);
                _context.OperationResult
                      = new OperationResult.Unauthorized
                            { ResponseResource = "You aint got no authoritah!!" };
                return false;
        }
        return true;
 }

As you can see, it uses the current IAuthenticationScheme (in this instance the RequiresOAuthAuthenticationScheme) to check authentication, and stops the pipeline with the relevant OperationResult if the result is AuthenticationResult.Failed.

Dependency Resolver problems

One of the immediate issues I came across was the fact that even though I was only implementing my RequiresOAuth attribute for my VoucherHandler Delete method, it was being run for every request. I quickly tracked this down to my Dependency Resolver “LocalInstaller” class which was setting up a default instance of all abstracts and interfaces using the following line:

AllTypes.FromAssembly(typeof(LocalInstaller).Assembly)
         .Pick()
         .WithService.FirstInterface()

In order to prevent default OperationInterceptors being setup I added the following line between Pick and WithService:

.Unless(x => x.IsSubclassOf(typeof(OperationInterceptor)))

You can then explicitly set up your OperationInterceptors for all methods within the ConfigurationSource class, or at any other time – in this instance via my attribute. This allows OperationInterceptors to be run on a per-method basis!
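
Usage then looks something like this on a handler (a sketch; the handler class and method body are hypothetical stand-ins for my VoucherHandler):

```csharp
public class VoucherHandler
{
    // Only this method picks up the RequiresOAuthInterceptor -
    // other handlers and methods run without it.
    [RequiresOAuth]
    public OperationResult Delete(int voucherId)
    {
        // voucher redemption logic would go here
        return new OperationResult.NoContent();
    }
}
```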

This would also be very useful for validating requests per method, etc.

Day Two

REST in practice implementation

I spent day two attempting to figure out a way to implement the model outlined in REST in Practice. In outline, it is as follows:

  • User attempts to pay with voucher
PUT /payment/1234
  • Consumer (RESTbucks) attempts to redeem the voucher
DELETE /voucher/1234
  • Voucher challenges redemption
401 Unauthorized - WWW-Authenticate OAuth realm="http://vouchers-service"
  • Consumer tries to acquire a set of temporary creds
POST /requestToken/voucher/1234
  • Voucher Service creates and returns request token and token secret
  • Consumer uses creds request accessToken
POST /accessToken/voucher/1234
  • Voucher Service creates and returns access token and token secret
  • Consumer then attempts to redeem the voucher with the Authorization credentials passed within the Authorization header.
DELETE /voucher/1234

I missed out a step where the user signs in to authorise the redemption and their creds are sent to the voucher service, but I left that for now.

OAuth Authentication Scheme

Carrying on from my day one implementation, I began to create my own custom IAuthenticationScheme, as discussed in this post.

Authenticate

This is where the logic for authentication would sit. I didn’t manage to implement all of it as I ran out of time. I quickly realised that, as I had only specified one attribute for OAuth, I was going to have to include logic for handling request token retrieval, access token retrieval and the final access token authentication check.

I used hard coded credentials at this point, just to be able to test, as this was only really a basic prototype.

The basic logic was this:

  • If no Authorization header retrieved, return new AuthenticationResult.Failed();
  • If any of the creds don’t match the expected creds, return new AuthenticationResult.MalformedCredentials() along with a message describing the issue.
  • Check either the request token or the access token against the expected one; if they match, return OAuthSuccess(), which is a custom implementation of AuthenticationResult()
  • If they don’t match, return new AuthenticationResult.Failed()
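
Sketching those bullet points as an Authenticate method (the header parser and the Expected* credential fields are hypothetical hard-coded stand-ins, not the actual implementation):

```csharp
public AuthenticationResult Authenticate(IRequest request)
{
    var header = request.Headers["Authorization"];
    if (string.IsNullOrEmpty(header))
        return new AuthenticationResult.Failed();

    // hypothetical parser pulling oauth_consumer_key / oauth_token out of the header
    var creds = OAuthHeader.Parse(header);

    if (creds.ConsumerKey != ExpectedConsumerKey)
        return new AuthenticationResult.MalformedCredentials();

    if (creds.Token == ExpectedRequestToken || creds.Token == ExpectedAccessToken)
        return new OAuthSuccess(creds.ConsumerKey);

    return new AuthenticationResult.Failed();
}
```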

I would have liked to take this all further and actually implement the correct hashing / encryption algorithms, but I ran out of time.

Challenge WWW-Authenticate response

The Challenge method allowed me to specify what to return as the WWW-Authenticate header. The logic for this then exists in the consuming class, in this instance the Interceptor.

if (authenticationResult is AuthenticationResult.Failed)
 {
       _scheme.Challenge(_context.Response);
 }

In this instance I just opt for attaching the Challenge to the context response. You could push the whole output into this.

With a 401 Unauthorized error you are supposed to add this WWW-Authenticate header to the response so that the consumer knows how it is meant to authenticate.

Sadly, our cut of OpenRasta had a known issue where any response that contained an OperationResult.Unauthorised was hard-coded to have the WWW-Authenticate header returned as “Digest” (see this thread).

Problems
  • Apparently, OpenRasta 2.1 has a different and better way of implementing custom authorization. As I was using our fork, which is based on OpenRasta 2.0.3, I wasn’t able to look into this or take advantage of it.
  • As OAuth has quite a steep learning curve, I was a little unsure how I should implement it. As I got closer to the implementation I realised that a one-hit authorisation-checking mechanism was not going to cut it, but by then it was too late to start again.
Still to do
  • Implement the OAuth request token and access token validation logic
  • Write acceptance tests to check implementation between a consumer and a service
  • Implement the exact logic as outlined in “Rest in Practice”
Further reading

REST in practice

OAuth

Beginners guide to OAuth

OpenRasta

OpenRasta messageboard

Posted in OpenRasta, REST, Software Development

Testing PerWebRequest Instances in StructureMap

When bootstrapping a StructureMap registry, you are able to set the “lifestyle” of a particular instance using StructureMap’s fluent interface.

For example, when using NHibernate, it is essential to set up ISessionFactory as a singleton and ISession on a per-HTTP-request basis (achievable with StructureMap’s HybridHttpOrThreadLocalScoped directive).

Example:

	For<ISessionFactory>()
		.Singleton()
		.Use(SessionFactoryBuilder.BuildFor("MY.DSN.NAME", typeof(TokenMap).Assembly))
		.Named("MyInstanceName");
	For<ISession>()
		.HybridHttpOrThreadLocalScoped()
		.Use(context =>
			context.GetInstance<ISessionFactory>("MyInstanceName")
				.OpenSession())
		.Named("MyInstanceName");

It’s nice and easy to test a Singleton was created with a Unit Test like so:

[TestFixtureSetUp]
public void FixtureSetup(){
	ObjectFactory.Initialize(ctx => ctx.AddRegistry(new NHibernateRegistry()));
}

[Test]
public void SessionBuilder_should_be_singleton(){
	var sessionBuilder1 = ObjectFactory.GetInstance<ISessionFactory>();
	var sessionBuilder2 = ObjectFactory.GetInstance<ISessionFactory>();
	Assert.That(sessionBuilder1, Is.SameAs(sessionBuilder2));
}

But how can you adequately test that you are receiving a new instance of ISession for every HttpRequest? One way we discovered was the following:

Create a basic class that provides an HttpContext; the key is to set the static HttpContext.Current to the faked HttpContext:

public class MockHttpContext : IDisposable {
	private readonly StringWriter _stringWriter;
	private readonly HttpContext _httpContext;

	public MockHttpContext() {
		_httpContext = HttpContext.Current;
		var httpRequest =
			new HttpRequest("test.aspx",
						"http://localhost",
						"?a=1&b=2");
		_stringWriter = new StringWriter();
		var httpResponse = new HttpResponse(_stringWriter);
		HttpContext.Current = new HttpContext(httpRequest, httpResponse);
	}

	public void Dispose() {
		_stringWriter.Dispose();
		HttpContext.Current = _httpContext;
	}
}

Then you can simply wrap each instance retrieved within a using statement thus:

[Test]
public void ISession_should_be_different_on_per_request_basis() {
	ISession sessionA;

	using (new MockHttpContext()) {
		sessionA = ObjectFactory.GetInstance<ISession>();
		Assert.That(sessionA, Is.Not.Null);
	}

	using (new MockHttpContext()) {
		var sessionB = ObjectFactory.GetInstance<ISession>();
		Assert.That(sessionA, Is.Not.SameAs(sessionB));
	}
}

And, by the same token:

[Test]
public void ISession_should_be_the_same_within_a_single_request() {
	using (new MockHttpContext()){
		var sessionA = ObjectFactory.GetInstance<ISession>();
		var sessionB = ObjectFactory.GetInstance<ISession>();
		Assert.That(sessionA, Is.SameAs(sessionB));
	}
}

And there you have it, the same instance within a single request, and a different instance within 2 different requests.

Also, you can use exactly the same method when testing Castle Windsor’s LifeStyle.PerWebRequest call.


Posted in Software Development, StructureMap

HtmlAgility web scraping

Introduction

I only had a day for this one so I decided to attempt something quick and easy. I decided to use HtmlAgility and the extensions offered by Fizzler to attempt to scrape results from the British Horse Racing website, so I can ultimately stash them in a Solr index or similar.

HtmlAgility and Fizzler

HtmlAgility is a very useful library that provides an IXPathNavigable XML interface around an XHTML DOM, which in turn allows you to treat data pulled from a web page as you would normal XML, including using XPath to access particular nodes. It is also very tolerant of malformed markup, which makes it good for web scraping.

Fizzler adds extension methods to HtmlAgility’s HtmlNode class (an implementation of the .NET IXPathNavigable). These methods give you the ability (among other things) to select IEnumerables of HtmlNode objects based on CSS 3.0 selectors.

Within HtmlAgility, a document is loaded into a concrete HtmlDocument in a similar way to XmlDocument:

var html = new HtmlDocument();
 html.LoadHtml(myHtmlString);

You can then access the nodes within the current DocumentNode property using methods and properties that will be familiar to anyone who has used the .NET System.Xml XmlDocument.

Fizzler adds a few extension methods that return IEnumerable<HtmlNode>, e.g. Descendants(), NodesAfterSelf() etc, but the interesting ones are the QuerySelector methods that allow the use of CSS Selectors:

html.DocumentNode.QuerySelectorAll("p.selected")

which returns an IEnumerable of HtmlNodes that match the above CSS selector criteria (e.g. a list of <p> tags with a class=”selected” attribute), and

html.DocumentNode.QuerySelector("div#myDiv")

which returns a single HtmlNode matching the specified CSS selector.

Getting started

I spent the morning playing about with Fizzler, writing tests to see what it could do. I really like the idea of being able to select nodes using CSS selectors, so I had a fiddle around to see which ones were supported by comparing them against the CSS3 Selectors specification. It turns out they aren’t all supported: :first-child and :nth-child(n) are, but :nth-last-of-type(n) and :first-letter are not.
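
The sort of test I used to probe selector support looked like this (NUnit; the HTML snippet is made up for illustration):

```csharp
[Test]
public void First_child_selector_is_supported()
{
    var html = new HtmlDocument();
    html.LoadHtml("<ul><li>one</li><li>two</li><li>three</li></ul>");

    // Fizzler's QuerySelector extension method on HtmlNode
    var first = html.DocumentNode.QuerySelector("li:first-child");

    Assert.That(first.InnerText, Is.EqualTo("one"));
}
```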

I then attempted to implement them against the British Horse Racing website, to pull results data into manageable DTOs.

The first problem I came across was that not all the data was available on one page, so I was going to have to do some crawling. In order to accomplish this I attempted to create a fluent interface that maintained the current state of the HtmlDocument.

Fluent HtmlSelector

I started out with the following interface:

public interface IHtmlSelector
{
     IHtmlSelector MoveToNode(string selector);
     IHtmlSelector ClickLink(string selector);

     IEnumerable<HtmlNode> Build();
     IEnumerable<HtmlNode> SelectNodes(string selector);
     HtmlNode CurrentNode();
}

This gave me the ability to move to a particular node and have that as the current node to run the next selection on. It also gave me the ability to return nodes as a list of HtmlNode, and to select a link defined by the selector, click through to that page and have its HTML loaded into the current HtmlDocument.

Adding History

I quickly realised that this approach threw up a problem. The website consists of 3 different pages to enable you to access the final results data.

  1. Courses
  2. Races on that day
  3. The results themselves

Once you have clicked through to the races and retrieved the data, you need to go back to the previous HtmlNode set. To handle this I used a Stack<HtmlNode>, which the ClickLink method pushes to:

_history.Push(_document.DocumentNode);

I then added another method to the interface:

IHtmlSelector GoBack();

This pops the last history item from the stack and sets it as the current DocumentNode, like so:

_document.LoadHtml(_history.Pop().OuterHtml);

DocumentNode has no setter, so you have to use the LoadHtml method and pass in the markup as a string. Not ideal, but it was the only way I could see to do it.
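
Putting the pieces together, ClickLink and GoBack might look something like this (a sketch; the _webClient field and the href extraction are assumptions, not the actual code):

```csharp
public IHtmlSelector ClickLink(string selector)
{
    var href = _document.DocumentNode
        .QuerySelector(selector)
        .GetAttributeValue("href", string.Empty);

    _history.Push(_document.DocumentNode);               // remember the current page
    _document.LoadHtml(_webClient.DownloadString(href)); // load the linked page
    return this;
}

public IHtmlSelector GoBack()
{
    // DocumentNode has no setter, so reload from the stored node's markup
    _document.LoadHtml(_history.Pop().OuterHtml);
    return this;
}
```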

Issues with Fizzler Extensions

Because the Fizzler methods are extension methods, they are tricky to stub within a unit test, which made unit testing code that consumes them difficult. Injecting a stubbed collaborator was not really an option, so, partly due to time constraints, I stopped unit-test-driving the actual web-scraping portion, opting instead for higher-level acceptance tests to drive development.

I would have preferred a decorator implementation, where the HtmlDocument could be wrapped in something like a CSSSelectorHtmlDocument class, which then added functionality. But this is just personal preference.

Other than that, it’s a great little tool that enabled me to cobble something together relatively quickly.

Code is available here

To do list

One thing I really wanted to do was to be able to have the nodes returned as Linq to Xml XElements, but I ran out of time.

I also wanted to hook the data up to Solr, but this shouldn’t be a problem. Finally I wanted to be able to access the results for a particular day, to make it easier for a service to pull the data. Again, this should be relatively simple, the results are available on the archive page alongside their respective dates.

Further reading

HtmlAgilityPack

Fizzler

CSS3 Selectors

Introducing Fizzler

Posted in Software Development

REST in Practice and OpenRasta

After having read the O’Reilly book “REST in Practice”, I set myself the challenge of using OpenRasta to create a basic RESTful web service.

For the first day I decided to concentrate on getting a basic CRUD app working, as outlined in chapter 4. This involved the ability to create, read, update and delete physical XML file representations of Artists. It is described in the book as a Level 2 application on Richardson’s maturity model, as it doesn’t make use of hypermedia yet.

One reason why OpenRasta is such a good framework for implementing a RESTful service is that it deals in “resources” and their representations. As outlined in “REST in Practice”, a resource is anything addressable via a URI, and OpenRasta handles this perfectly as it was built around this model from the ground up.

The Basic Web Service

For the basic web service I created an ArtistHandler in the normal OpenRasta way, creating C# methods within the handler for each of these four HTTP verbs:

  • GET for reading.
  • POST for creating.
  • PUT for updating.
  • DELETE for deleting.

I used the [HttpOperation] attributes just to make the relationship between method and verb more explicit; OpenRasta will actually auto-map a method named Post() to the POST verb, and so on.

The main aim of this exercise was to discover exactly what http response statuses and headers I should be returning, and whether it was possible to adhere strictly to the guidelines using OpenRasta.

The HTTP template I used for the endpoint was:

/artist/{artistId}

Http Responses

The responses I wanted to give were structured as they are outlined in the book, and by w3.org, e.g.:

GET /artist/{artistId}
  • Returns a 400 BadRequest along with a list of errors, if artistId not supplied.
  • Returns a 404 NotFound if record for that artist is not found
  • Returns a 200 OK along with the record if the record was found
  • Returns a 500 Internal Server Error on exception
POST /artist
  • Returns a 400 BadRequest along with a list of errors, if any parameters not supplied.
  • Returns a 302 Found along with the Location uri of the resource if it already exists.
  • Returns a 201 Created along with the Location uri of the new resource on success (this could also contain the body of the new resource)
  • Returns a 500 Internal Server Error on exception
PUT /artist/{artistId}
  • Returns a 400 BadRequest along with a list of errors, if any parameters not supplied.
  • Returns a 404 NotFound if record for that artist is not found
  • Returns a 204 NoContent along with the Location uri of the updated resource on success(not sure about this myself, but was specified in the book)
  • Returns a 500 Internal Server Error on exception
DELETE /artist/{artistId}
  • Returns a 400 BadRequest along with a list of errors, if any parameters not supplied.
  • Returns a 404 NotFound if record for that artist is not found
  • Returns a 204 NoContent on success.
  • Returns a 405 MethodNotAllowed on any IO exception
  • Returns a 503 Service Unavailable on any other exception

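The GET rules above map fairly directly onto a handler method, something like this sketch (the repository field and the error message are hypothetical; the 500 case is left to the framework):

```csharp
[HttpOperation(HttpMethod.GET)]
public OperationResult Get(int artistId)
{
    if (artistId <= 0)
        return new OperationResult.BadRequest
            { ResponseResource = new[] { "artistId must be supplied" } };

    var artist = _artistRepository.GetById(artistId); // hypothetical repository
    if (artist == null)
        return new OperationResult.NotFound();

    return new OperationResult.OK { ResponseResource = artist };
}
```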
Issues

I had a couple of small issues with responses and OpenRasta. For instance, there is no built-in OperationResult representing a 503 Service Unavailable response, but I could create my own by changing some settings on an InternalServerError response, or by inheriting from the OperationResult class, so no problems there.

Also, I wasn’t able to pass POX (Plain Old Xml) to the POST endpoint without OpenRasta throwing an internal exception, something which I’ll have a look at in due course.

Using Curl

I used curl to test the endpoints. I tried Fiddler, but OpenRasta would always return a 415 Unsupported Media Type response. I imagine this was due to one of the headers not being specified properly; again, this may be worth looking into, but due to time constraints I didn’t. Using curl is quick and easy – I just used variations on the following:

$ curl -v "http://localhost/restful_service/artist" -X "POST" -d "Id=100022&Name=WASP&Genre=LameRock"

Reaching Level 3

One thing you need to do to move a service towards a Level 3 rating is to offer up links to related endpoints for the resource, e.g. links to page to the previous or next record, or a link to fulfil or pay for an order. As a nod to this, I created a link to DELETE a record that is returned when you GET an artist, e.g.

<link rel="artist" href="http://localhost/restful_service/artist/10010" method="DELETE"/>

“REST in Practice” recommends the use of Atom feeds to truly create a Level 3 RESTful service, but Martin Fowler’s post on the Richardson Maturity Model suggests simply using standard HTML-style link tags, as I have done for the DELETE link above.

Wish List

There were many other things I would have liked to look at, namely caching, ETags, creating Atom feeds and implementing OAuth, but I ran out of time. At the time of writing, OpenRasta does not support OAuth out of the box, but according to this post it is something they are looking into.

An interesting move forward would be to create an

OAuthAuthenticationScheme : IAuthenticationScheme

within our own fork of OpenRasta. (https://github.com/7digital/openrasta-stable)

You can grab the project from here:

(https://github.com/gregsochanik/RESTfulService)

Posted in REST, Software Development