By Passion.dk
Software Development, Agile Principles and Entrepreneurship Meets Real Life

Webjobs custom naming with INameResolver

July 1, 2014   

With the latest release of the Azure WebJobs SDK it is now also possible to do custom name resolving of triggers and output bindings. Before, the only option was to hard-code the names in the attributes. With the new name resolving it's now possible to differentiate names across deployment environments etc.

The magic starts with the new JobHostConfiguration and the interface INameResolver

static void Main(string[] args)
{
    var config =
        new JobHostConfiguration(
            "DefaultEndpointsProtocol=http;AccountName=YourAccountNameHere;AccountKey=YourKeyHere")
        {
            NameResolver = new QueueNameResolver()
        };
    var host = new JobHost(config);
    host.RunAndBlock();
}

Compare this to the previous release, where the JobHost got the connection string as a constructor parameter. The benefit of this change is that we can now replace the default NameResolver through a property. Here I added my own QueueNameResolver, which looks the name up in the configuration, but that is just one option, not a limitation :).

public class QueueNameResolver : INameResolver
{
    public string Resolve(string name)
    {
        // or whatever logic you want here to resolve the name
        return CloudConfigurationManager.GetSetting(name);
    }
}

To trigger the new name resolver, simply change the signature of your webjob handler method to include a %key% as the argument.

void JobHandler([QueueTrigger("%signupQueueKey%")] Customer newCustomer, [Blob("verifiedcustomers/{Email}.json")] out VerifiedCustomer customer)
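For the resolver above to return something useful, the key between the percent signs must of course exist in whatever configuration source it reads from. A small sketch of the round trip, where the setting name and value are just examples:

// Hypothetical setting in app.config / the cloud service configuration:
//   <add key="signupQueueKey" value="signupqueue-dev" />

var resolver = new QueueNameResolver();

// The SDK passes the text between the % signs to Resolve...
var queueName = resolver.Resolve("signupQueueKey");

// ...and uses the returned value ("signupqueue-dev" in this example) as the actual queue name.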

If we run the job and break at the Resolve method inside our custom resolver, we will now see the key we provided as the argument.
[Screenshot: breaking inside Resolve shows the key passed as the argument]

Azure webjobs SDK basic parameter binding

July 1, 2014   

In my previous blog post I wrote about how to use logging with the most basic queuing mechanism in the Azure WebJobs SDK, where the message in the queue is just a string.

void Handler([QueueInput("signupqueue")] string customername)
{
    // magic goes here...
}

A small update on the latest webjobs sdk 0.3.0-beta-preview

Since I wrote that blog post a new preview of the WebJobs SDK has been released, and besides the crazy MS versioning it just keeps getting better and better. In relation to the previous blog post, the input and output parameters have changed a bit. Now the above would be written as:

void Handler([QueueTrigger("signupqueue")] string customername, [Queue("confirmedcustomers")] out string customer)
{
    // magic goes here...
}

QueueInput has been changed to QueueTrigger, and QueueOutput has been changed to just Queue.

Back to the parameter naming

In the previous solution the input was a simple string, which is fine for small, simple solutions, but in most solutions we tend to have richer data like a customer, an order etc. Let's make a sample where users sign up on a form, an everyday web scenario. They enter data in a form and submit it to a server. Instead of running all the business logic directly at the endpoint, we can push the request to a queue and let a well crafted handler process the request data. Doing it this way we can return quickly to the user and get more decoupled and scalable code by having separate handlers for specific tasks, in this case a sign up process.
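As a rough sketch of the producing side (the method name, the connection string setting and the use of the Azure Storage client library are assumptions for illustration), the endpoint could serialize the form data and drop it on the queue like this:

// Sketch: the web endpoint pushes a sign up request to the queue and returns immediately.
public void EnqueueSignup(Customer newCustomer)
{
    var account = CloudStorageAccount.Parse(
        CloudConfigurationManager.GetSetting("StorageConnectionString"));
    var queueClient = account.CreateCloudQueueClient();
    var queue = queueClient.GetQueueReference("signupqueue");
    queue.CreateIfNotExists();

    // The message body is plain JSON, e.g. { "Name": "Jane Doe", "Email": "jane@example.com" },
    // which is exactly what the webjob binding deserializes into a Customer.
    var json = JsonConvert.SerializeObject(newCustomer);
    queue.AddMessage(new CloudQueueMessage(json));
}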

Back in the webjob, we only need to change the signature of the handler a bit

void Handler([QueueTrigger("signupqueue")] Customer newCustomer)
{
     // more magic here
}

And add the class so the binding mechanism knows the format of the input type Customer.

public class Customer
{
      public string Name { get; set; }
      public string Email { get; set; }
}

If we now go to the server tools and browse the Azure queue the webjob is using, we can add a simple JSON snippet to verify the binding is working as expected. This JSON will be our sign up message, in the format we expect as input from the queue. When we run the code and break in the handler, we can see the webjob binding mechanism in action, as our customer now holds the values from the message. Pretty neat, and familiar if you have been using ASP.NET MVC etc.

[Screenshot: adding a JSON message to the queue from the server tools]

Now let's take the sample a bit further. Let's say the result of the handler should be streamed to a blob inside our Azure storage. The blob should be stored in a container named verifiedcustomers, in a JSON file named after the customer's email.

First we need to change the signature. We add a second parameter to the handler, the output parameter.

[Blob("verifiedcustomers/{Email}.json")] out VerifiedCustomer customer

We define the output as a blob with the name of the container followed by a placeholder, {Email}. This is a reference to the Email property on the input parameter Customer. Finally we add the .json suffix to indicate it's a JSON file. There is one final change: because our verified customers also hold a verified date, we model them as a VerifiedCustomer. The final handler signature, voilà!

void NewCustomerEmailHandler([QueueTrigger("signupqueue")] Customer newCustomer, [Blob("verifiedcustomers/{Email}.json")] out VerifiedCustomer customer)
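Filled out, the whole handler could look roughly like this (the "verification" is of course made up; your real logic goes where the comment is):

public static void NewCustomerEmailHandler(
    [QueueTrigger("signupqueue")] Customer newCustomer,
    [Blob("verifiedcustomers/{Email}.json")] out VerifiedCustomer customer)
{
    // Whatever verification you need goes here; for the sketch we just copy the data
    // and stamp the time. The result ends up as verifiedcustomers/<email>.json.
    customer = new VerifiedCustomer
    {
        Name = newCustomer.Name,
        Email = newCustomer.Email,
        Verified = DateTime.UtcNow
    };
}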

Custom binding…

If you run the job now and add a message to the queue, the result will be an IndexException with the message "Can't bind to parameter VerifiedCustomer…". This happens because the instance of VerifiedCustomer is being streamed to the container, and as it stands there is no default serialization to handle this. Therefore we need to add support for it by implementing the interface ICloudBlobStreamBinder and its reader and writer methods. A simple implementation can be made using Json.NET:

public class VerifiedCustomer : ICloudBlobStreamBinder<VerifiedCustomer>
{
    public string Name { get; set; }
    public string Email { get; set; }
    public DateTime Verified { get; set; }

    public VerifiedCustomer ReadFromStream(Stream input)
    {
        // Read the whole blob content (the JSON may span multiple lines) and deserialize it.
        var reader = new StreamReader(input);
        var data = reader.ReadToEnd();
        return JsonConvert.DeserializeObject<VerifiedCustomer>(data);
    }

    public void WriteToStream(VerifiedCustomer customer, Stream output)
    {
        // Serialize the customer and write it to the blob stream.
        var data = JsonConvert.SerializeObject(customer);

        using (StreamWriter writer = new StreamWriter(output))
        {
            writer.Write(data);
            writer.Flush();
        }
    }
}

If we rerun the code and add a new customer to the signup queue, the result will be a new blob in our blob storage container.

[Screenshot: the resulting blob in the verifiedcustomers container]

Hosting Azure webjobs outside Azure, with the logging benefits from an Azure hosted webjob

June 9, 2014   

First there is the Azure WebJobs SDK

A little after Azure WebJobs were first introduced, the Azure WebJobs SDK was also introduced, making it possible to trigger a job by an "event" like a message in a queue, a new blob etc. This makes it a lot more frictionless and clean to have a job triggered by some event, compared to implementing the trigger yourself inside the job with polling.
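To give an idea of the difference, a hand-rolled polling job would look roughly like this (just a sketch: queue is a CloudQueue obtained from the storage client library, and Process is a placeholder for your own handler):

// Roughly what you had to write yourself before the SDK: an endless poll loop.
while (true)
{
    var message = queue.GetMessage();
    if (message != null)
    {
        Process(message.AsString);
        queue.DeleteMessage(message);
    }
    else
    {
        Thread.Sleep(TimeSpan.FromSeconds(10));
    }
}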

The basic job

First we need to create a regular console application in Visual Studio and install the NuGet package Microsoft.WindowsAzure.Jobs.Host to enable the webjob SDK features. Please note that this package is currently in prerelease, so the NuGet command should include the prerelease option:

install-package microsoft.windowsazure.jobs.host -Prerelease

With this package in place, we can create our worker method. For simplicity we create a simple method that gets triggered by a message in a queue and logs the input.

public static void Capture([QueueInput("orderqueue")] string orderMessage)
{
     Console.WriteLine(orderMessage);
     //real work goes here.....
}

This tells the webjob SDK to look for a message in a queue named orderqueue in your storage account, so we also need to specify the storage account details.

static void Main(string[] args)
{
    var host = new JobHost("DefaultEndpointsProtocol=http;AccountName=YourAccountNameHere;AccountKey=YourAccountKeyHere");
    host.RunAndBlock();
}

We specify a new JobHost and start it inside the Main method. You need to replace the account name and account key placeholders with your own credentials. This enables the SDK to reach the orderqueue used in the worker method; all the wiring is done "automatically" by the webjob SDK.

Hosting the WebJob outside Azure Websites

Under normal circumstances you would host your webjob as part of an Azure website. The reason I want to show hosting outside the Azure environment is that I'm using a library in another project that references GDI, and GDI is not supported by Azure webjobs yet. So this is a great opportunity to show alternative webjob hosting.

Deploy and start the webjob

The deployment is simple: just copy your webjob to a VM or run it directly on your own machine, and start the job as you would a regular console application.

Linking to an Azure website to enable logging

To enable logging for the webjob, we need to create an Azure website as we would for normal hosting. The trick is that inside the Configure section we add a connection string for the AzureJobsRuntime. If you are only viewing logs through Visual Studio as illustrated below, this step can be omitted.

Logging from the VM to Azure Website

Finally it's time to show it all in action. From within the Visual Studio Server Explorer, create a new queue with the name we used earlier, orderqueue.

[Screenshot: creating the orderqueue in Server Explorer]

Open the queue and "add new message". We can do this because we are using a basic string as input, so just write "Hello Log" and press OK; the message is now in the queue.

If not already started, start the webjob. The job will soon pick up the message you added to the queue, shown like this in the console.

[Screenshot: the console output when the message is picked up]

To view the content of the message, open the Server Explorer and open the Blob node under your storage account.

[Screenshot: the log blobs under the storage account in Server Explorer]

The log should now contain 2 entries. Click the largest one and you should see your message content inside.

[Screenshot: the job log entries]

[Screenshot: the logged message content opened in Notepad]

So this is all it takes to enable logging on a webjob hosted outside an Azure website.

 

How To Use Google Analytics From C# With OAuth

March 2, 2014   

Many people know about and use Google Analytics for their websites and apps to collect user behavior and other useful statistics. Most of the time people also use the collected information directly from the Google Analytics website. On one of the recent projects I've been working on we needed to extract data from Google Analytics from a C# service with NO user interaction, aka a background service. This blog post is a summary of the roadblocks I stumbled upon to get the job done. All code depends on the Google Analytics V3 API and the source code for this post is available on GitHub.

Setting up the C# project and adding Google Analytics dependencies

For the purpose of this sample we will use a standard C# console project. Note that I'm running Visual Studio 2013 and .NET 4.5, which will have an impact described later. When the project is created we need to include the Google Analytics NuGet package. The package is currently in beta, but I haven't had any issues using it (the Google Analytics V3 API itself is not in beta). From the Visual Studio Package Manager Console, search for the prerelease package named google.apis.analytics.v3 using the following command:

get-package -Filter google.apis.analytics.v3 -ListAvailable -prerelease

To install the package execute the following command

install-package google.apis.analytics.v3 -prerelease

There are quite a lot of dependencies, so I won't post a screenshot. The important part is to ensure that the -prerelease option is included. To proceed from here we need to get the required keys from Google Analytics and set up OAuth authentication.

Setting up Google Analytics to allow read access from a native service application

To extract data from Google Analytics, we need to create a project inside the Google Developer Console. This project is used to manage access and keep track of the request usage, which we will look at later. Go to the Google Developer Console and create a project. The project will be listed in the overview; here you can see mine, named "My Project". Selecting the project after it is created displays a screen where we need to enable the Analytics API. If you need access to other APIs later on, this is the place to go.

Enabling OAuth access to the project

For this project we want to use OAuth for authentication. Google provides multiple ways of doing this depending on the use case. From the Credentials screen select "Create New Client Id". Here it's important to select Installed application and Other. Click Create Client Id, and we get back to the project overview with the new client id listed. Record both the Client ID and Client Secret as we need them later from the program. That's it, now we are ready to start coding against the API!

Setting up AnalyticsService in C#

Going back to Visual Studio and the project we created earlier, insert the following code into the main of the program. Note that this is only sample code :).

class Program
{
    static string clientId = "INSERT CLIENTID HERE";
    static string clientSecret = "INSERT CLIENTSECRET HERE";
    static string gaUser = "YOURACCOUNT@gmail.com";
    static string gaApplication = "My Project";
    static string oauthTokenFilestorage = "My Project Storage";

    private static AnalyticsService analyticsService;

    private static void Main(string[] args)
    {
        var credential = GoogleWebAuthorizationBroker.AuthorizeAsync(
            new ClientSecrets
            {
                ClientId = clientId,
                ClientSecret = clientSecret
            },
            new[] { AnalyticsService.Scope.AnalyticsReadonly },
            gaUser,
            CancellationToken.None,
            new FileDataStore(oauthTokenFilestorage)
        ).Result;

        var service = new AnalyticsService(new BaseClientService.Initializer()
        {
            HttpClientInitializer = credential,
            ApplicationName = gaApplication
        });
    }
}

What's happening here is that we create the AnalyticsService, which is the service we are going to use when sending requests to Google Analytics. We provide it with the information we got earlier when creating the project inside the Google Developer Console. Also note that for this purpose we are using the scope AnalyticsReadonly. Finally we pass a FileDataStore as the OAuth token storage. This is the storage Google will use to keep the OAuth tokens issued during the requests between Google Analytics and our application. On your machine it will be physically located at %AppData%\StorageName, where StorageName is the name we specified for the file storage. It is possible to change this to your own implementation using a database etc., but that will be covered in a later blog post; for now we will just use the file storage.
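Just to show the shape of what such a custom storage has to implement, here is a bare-bones in-memory sketch of the Google.Apis.Util.Store.IDataStore interface (not something for production, and a database-backed version is a topic of its own):

// Minimal in-memory IDataStore sketch: tokens live in a dictionary instead of on disk.
public class InMemoryDataStore : IDataStore
{
    private readonly ConcurrentDictionary<string, string> store =
        new ConcurrentDictionary<string, string>();

    public Task StoreAsync<T>(string key, T value)
    {
        store[key] = JsonConvert.SerializeObject(value);
        return Task.FromResult(0);
    }

    public Task DeleteAsync<T>(string key)
    {
        string ignored;
        store.TryRemove(key, out ignored);
        return Task.FromResult(0);
    }

    public Task<T> GetAsync<T>(string key)
    {
        string json;
        var value = store.TryGetValue(key, out json)
            ? JsonConvert.DeserializeObject<T>(json)
            : default(T);
        return Task.FromResult(value);
    }

    public Task ClearAsync()
    {
        store.Clear();
        return Task.FromResult(0);
    }
}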

Ready to Authenticate with Google Analytics

When the information in the source code is updated to match your own settings, we are ready to test that we can initialize the OAuth authentication. As mentioned earlier we will run this service with no user interaction. However, the first time we need to run it "manually", meaning from a console. This is needed because we will be prompted to allow access and get the initial access_token, but more importantly the refresh_token. The refresh token is the one used to renew the access token when it expires. When we run this the first time we get the token information in a file inside the OAuth storage specified by our app. From there we can copy it to wherever we want, whether to our own machine or to a server running our service. So the user interaction is only a one time thing. If you change account or profile id (development, staging, production etc.), you need to redo this step! Hit F5 and our application starts and… crash!

Could not load file or assembly ‘Microsoft.Threading.Tasks.Extensions.Desktop, Version=1.0.16.0, Culture=neutral…….

Now what just happened? Because we are running .NET 4.5 things have changed a bit. If we do a little investigation (thanks to Google and Stackoverflow) we can see that this file should be included by the Microsoft.Bcl.Async NuGet package. A quick look into packages\Microsoft.Bcl.Async.1.0.165 reveals that we have a later version and that the assembly is NOT included in the 4.5 folder of the package. To ensure we can authenticate, add a reference to the 4.0 file we are missing; simply browse to the package you included and you'll find both the 4.0 and 4.5 assemblies. Then add an assembly binding redirect in the project's app.config file to redirect to the previous version.

Now with both the assembly reference and assembly binding in place, run the project again. This time the project launches a browser where we are asked to allow the application access to our Google Analytics data.

[Screenshot: Google's consent screen asking to allow access to Analytics data]

If we take a look into the OAuth token file storage located in %AppData%\YourStorageName, we should now see a file containing our token information: the access_token, the refresh_token and some other details. Remember that the most important part here is the refresh token, as it is used to renew the access token whenever it expires.

The 1st Request To Analytics Data

By now everything is ready and we can make the first request. Enough writing, let's go code first.

string start = new DateTime(2014, 1, 1).ToString("yyyy-MM-dd");
string end = new DateTime(2014, 1, 10).ToString("yyyy-MM-dd");
var query = service.Data.Ga.Get(profileId, start, end, "ga:visitors");

query.Dimensions = "ga:visitCount, ga:date, ga:visitorType";
query.Filters = "ga:visitorType==New Visitor";
query.SamplingLevel = DataResource.GaResource.GetRequest.SamplingLevelEnum.HIGHERPRECISION;

var response = query.Execute();

Console.WriteLine("Entries in result: {0}", response.TotalResults);
Console.WriteLine("You had : {0} new visitors from {1} to {2}",
    response.TotalsForAllResults.First().Value, start, end);
Console.WriteLine("Has more data: {0}", response.NextLink != null);
Console.WriteLine("Sample data: {0}", response.ContainsSampledData);

Building the request

For the purpose of this post we will just make a simple query: "get the new visitors to my blog, not the returning ones". First up we use the Get method on the AnalyticsService data resource. This method takes a profileId, the date range we are requesting within, and finally the metric we want to measure. The profileId is the unique id describing the "table" of the account we are querying. To get the profileId, navigate to the administration section of your Google Analytics account, click "View Settings" in the View column, and you'll see the profileId listed as the View Id. With the query created, we are able to define the dimensions (output) and additional filters to further narrow down the result set. For the purpose of this blog post I've added New Visitor as a filter, to filter away returning visitors. Before looking at the response format, take a look at the format of the input parameters: the profileId, metric(s), dimension(s) and filter(s) are all prefixed with ga:. This is mandatory, and if you leave it out you will get really weird errors. For a detailed description take a look at the Core Reporting API – Reference Guide. The same goes for the date format.
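To make the prefix requirement concrete, the View Id has to be turned into a profile id like this (the number is of course a placeholder):

// The View Id from the admin section must carry the ga: prefix before it is used as profileId.
string profileId = "ga:12345678";   // replace 12345678 with your own View Id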

Handling the response

Once the request is ready we call Execute to get the response. The result contains very detailed meta information besides the actual result. As you can see I've listed four: TotalResults, TotalsForAllResults, NextLink and ContainsSampledData. TotalResults is a count of the entries contained in this response, and TotalsForAllResults is the sum over all results for the metric. NextLink indicates whether we got all results in this response; if not, it points to the next result set. There is a limit of 1000 results per request, but the NextLink makes it very easy to navigate to the next result set. Finally, ContainsSampledData indicates whether the result is sampled or not. To get a more precise extract you should limit the request date range and also set the SamplingLevel of the request to higher precision, as in the code above.
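If a query returns more than 1000 entries, a simple approach is to page through the results by bumping the start index until there is no NextLink. A rough sketch, reusing the service, profileId, start and end variables from the snippet above (StartIndex and MaxResults are the request properties corresponding to the API's start-index and max-results parameters):

// Sketch: collect all rows, 1000 at a time, following the paging hints in the response.
var allRows = new List<IList<string>>();
int startIndex = 1;
GaData page;

do
{
    var pagedQuery = service.Data.Ga.Get(profileId, start, end, "ga:visitors");
    pagedQuery.Dimensions = "ga:visitCount,ga:date,ga:visitorType";
    pagedQuery.Filters = "ga:visitorType==New Visitor";
    pagedQuery.MaxResults = 1000;       // the per-request maximum
    pagedQuery.StartIndex = startIndex;

    page = pagedQuery.Execute();
    if (page.Rows != null)
    {
        allRows.AddRange(page.Rows);
    }

    startIndex += 1000;
} while (page.NextLink != null);        // no NextLink means this was the last page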

Debugging requests

While building the requests the Google API Explorer comes in very handy. Here you can test all your input and investigate the request and its details.

At the end of the road there is a stop sign

My final note on the basic request setup is a heads up about the default request limits and quota defined by Google. Please get a good understanding of where your limits are, both in number of requests and in frequency. And when you exceed them, as I did, the Error Responses documentation is a good place to start building a more solid foundation. To follow the request usage of the current project/profile, use the old console monitor accessible from the Developer Console: click APIs, select the Analytics API we enabled earlier, and you'll see the menu to access both the quota settings and the report. The report lets you monitor your current and previous usage, and the quota settings let you request a higher request frequency etc. So far the new Cloud Console (see the link in the screenshot) doesn't seem to monitor correctly and its update frequency is very low. For now use the "old" one, where you can also download a request report on a day level, see failed requests etc. I hope this demystifies and guides you to a better understanding of the basics of accessing Google Analytics from C#. All source code is available on GitHub.

Execute Book

November 16, 2013   

Ok I'll admit it: one of my challenges as a developer is to execute. Often I get very curious about new technologies, which might harm the execution of the product I'm working on. I'm aware of it, and after reading this great book I really got inspired and got some takeaways to improve myself.

This great book by Drew Wilson & Josh Long was written in only 8 days! It's about a product built within only 8 days, SpaceBox.io. I read a tweet about the book, watched the video and quickly identified myself :). I bought the book and 1½ weeks later I had read it from start to end; very easy to read and a joy all the way.

I really like how the book is built, and you can almost feel the energy of the writer as you read along. It might be easier for a software entrepreneur to identify with the content. You will get a lot of input on how and why to forget your old habits and start to execute. Do the thing you feel and do it now; all you need is focus. There isn't always a need to make big plans, as they will often be a roadblock towards shipping what you're building. No one knows the future and you can't plan what your product will be in the future. Ship and get feedback.

I think there is more than one target group for this book. First there are the entrepreneurs who want to create and ship products, but the book might as well be an inspiration in software companies building projects for clients. The latter part of the book has some great perspectives on the use of technology and how to keep moving forward. Interestingly enough, the role of the "builder" in the book could be called a fullstack developer, one that cares about and can work in every phase of a product's lifecycle. You will learn as you go along, and features should be what drives your learning.

I really enjoyed reading the book and would recommend it to everyone interested in building products and shipping them.

Buy the book here. http://executebook.com/

The $100 Startup – value for every cent

October 27, 2013   

[Image: The $100 Startup book cover]

For some time I've been reading this fantastic book by Chris Guillebeau; I've finally finished it, and here are my thoughts.

I originally found the book based on some references at Amazon, and thought of it as a fast read and an energy kick. I wasn't let down.
Overall the book describes a lot of startups in different business areas, all started with a very small amount of money, the $100. The idea behind most of them was not to create a million dollar business, but to follow a passion of the people behind them and change their way of living. One remarkable thing is that it is not yet another Facebook story about a group of IT students. There are stories from all levels of society: from the homeless, to the manager who just loves to travel, to the mom who wants to spend more time with her children and family. Follow your dreams and trust in them.
A big part of the book is written as guides, also available on the website: easy to get going guides, the micro business plan, how to think about monetizing your business, and later how to increase your sales, to name just a few examples. Exactly this part is what makes the book excellent and also makes it work as a reference. I personally know that I will reread some of the chapters later. The hints are easy to understand and bring to life for everyone. While reading the book I had to take a break from time to time, because I started to think! Think about all the input I got from the book and whether I could use it in some way for my own business ideas. I would recommend this book to anyone interested in entrepreneurship and small businesses; you won't be let down.

Find the book here

The book's homepage: 100startup

The Attackers Are Here, don’t wait!

October 1, 2013   

Don't wait: when you deploy your service online you are subject to attack. Web security was one of the main topics at GOTO today, and could very well be summed up by this very simple comic.

First up, Aaron Bedra gave a great talk about some of the defence mechanisms you can use to protect your online services, and especially about how to look for anomalies in the requests you receive. He demonstrated ModSecurity, a rule based web application firewall. But ModSecurity cannot stand alone; instead it should be used together with tools like Repsheet.

Repsheet is a collection of tools to help improve awareness of robots and bad actors visiting your web applications. It is primarily an Apache web server module with a couple of addons that help organize and aggregate the data that it collects.

Aaron demonstrated different indicators to look for in incoming requests; as indicators of fraud etc. stack up in a request, the chance that the request is an attack attempt increases. Aaron really made his point and showed some great examples. Repsheet really seems like a great product and could be a great add-on for PaaS providers like Azure.

The second part of the web security sessions focused on the possible attacks one could make against an online service. The session was presented by Niall Merrigan and covered the items in the OWASP top 10. As he stated at the start:

Developers don’t write secure code because we trust, are lazy and security is hard

And this sounds just about right, but as he pointed out example by example, we might want to understand just a bit more about security and put an effort into it. Very simply, he googled for web.config and got access to some running services' web.config files, ready to be downloaded. Later he pointed out that IIS should protect you from having your web.config file downloaded, but with a simple trick there is a basic workaround, and suddenly your web.config information like connection strings etc. is public and can easily be exploited.

No clear text passwords!!

Of course you could hash your passwords, but if someone already has your web.config file, they also have your machine key, and suddenly the world is not that secure any more. One of the more basic but still very common issues is SQL injection. Niall asked a very simple question: why do we often use the same connection string for both read and write operations? Why do you need write permissions to do a search? Very good question. Niall gave us a lot to think about, and I think it was a wakeup call for most of the audience to start thinking more about security in the solutions we make in our daily work.

Continuous Brain Food Delivery

October 1, 2013   

What makes a car drive? The motor.

What makes the motor run? Fuel

What do you do to keep the motor running? You continuously refuel the car

Now you might ask how this fits into a development conference like GOTO:

What makes you learn? Your brain receives input

What makes the brain receive inputs? Energy

How do you get energy? You continuously ensure you get the right kind and the right amount of food.

So what's all this about? I'm sad to say, but the food level at the GOTO conference is just way below average for a conference at this price level. All participants pay quite a lot of money for the ticket and expect to get value for money. But if the brain is not running all day long you will not get the full output from the sessions, which you should at a conference with great sessions like this.

After having been to a couple of conferences over the last couple of years, I've seen different approaches to ensuring we all get food, and the right kind of food. Some use prepacked lunch boxes to reduce the food queue. Others ensure that there is always food available to keep you going, so there are no peak situations between sessions: continuous food delivery. Then there is GOTO, where you get ONE meal during a whole conference day and an absolute minimum of snacks during the breaks. You might argue we're developers and could live off only coffee and energy drinks; well, time to get back to reality. When you finally head off to have lunch, you'll probably end up in a queue just to wait, wait and wait. That's not good enough.

Just as we try to learn the domain when developing software, the organisation behind GOTO should start taking this topic seriously, as this is a major and very important pitfall IMO. The location is great, the audience is great, the speakers too, but the food really sucks. I really hope to see more continuous food delivery, just like at NDC Oslo etc. It really works. At NDC you have food at all hours of the day, and whenever you feel like it you can eat, so your brain never shuts down because of low energy.

Just to illustrate the difference between GOTO and another development conference in Scandinavia:

[Photo: the food queue at GOTO]

Windows Azure Mobile Services and the documents in the SQL Server

October 1, 2013   

Yesterday I attended a session on Windows Azure Mobile Services (WAMS) at GOTO Aarhus. All the great features were demonstrated, including how easy it is to store and retrieve data. One thing I still don't get is why the storage underneath has to be a SQL Server. Has to be, in the sense that when you create your Mobile Service in the management portal, this is a requirement!

[Screenshot: creating a Mobile Service requires choosing a SQL database]

The usage of the storage is just like handling documents in a document database like MongoDB etc., so why not use the right tool for the job? A SQL Server is not cheap to use and it's not meant to be a document store. IMO it would be a huge advantage to offer a document database option and get all the benefits that come with it. On the other hand you can also use parts of WAMS, like the scheduler, without using the storage at all, so why should you have to create one in the first place? I understand the easy CRUD support, but I still think it should be optional whether you want SQL Server storage or no storage at all. You could make a scheduled mobile service that used any other service to retrieve its data and still make use of the super easy push mechanism inside WAMS. Don't destroy a great service by making assumptions. If you store data with WAMS it also works like a document storage, and using the node scripting on the management portal you feel kind of in the world of NoSQL, but you're not. Looking forward to enabling my Azure MongoLab add-on with WAMS and using its aggregation framework to make it even more powerful…

[Screenshot: the Mobile Services overview in the management portal]

GOTO; Day#1

September 30, 2013   

The first day of the GOTO conference 2013 has come to an end. As always, the first day of a conference is a mix of impressions: you meet a lot of old friends to catch up with while trying to aim for the right sessions to attend.

Syntaxation – Douglas Crockford

This was my first session of the day, presented by Douglas Crockford. Douglas stuck closely to his headline and presented various programming languages like Fortran, Fortran IV, BCPL and Algol 60/68, comparing how syntax has evolved over time. It is not so much that the language itself makes a syntax a success or not; rather, we as developers and users of a programming language have an emotional feeling about how a specific syntax should be. Even an optimization like minimizing a syntax by removing "{ }", or not ending lines with ";", is something we have "feelings" about. Douglas did a walkthrough using the if/else statement as his use case, which was simple enough to understand and still complex enough to show how differently the various languages handle it.

The second part of the session was related to designing programming languages, and as he stated:

We Should All Be Designing Programming Languages

A language should be innovative, not just yet another "CokeBottle cokeBottle = new CokeBottle();" language; make a difference. Make a language that does one specific thing very well and stays focused. Make it non-confusing and non-cryptic to use. The session was well presented and made its point. Even though we are not all going to invent a programming language, we still have an opinion about the languages we use, and that message was made clear.

Modern web app development with ember.js, node.js, and Windows Azure

Not a day goes by without another noun-dot-js being brought to life on social media, just like the headline for this session presented by Tim Park. This fast paced session covered the basics of making a Twitter client consisting of ember.js, node.js and Windows Azure. The frontend was created with ember.js, a single page application framework that is by nature a full blown clientside framework including model binding, templating and routing. It wasn't a presentation specific to ember.js, and I think Tom Dale would put more focus on the URL when creating an ember.js application than Tim did. The backend for this application was created using node.js, a very minimalistic backend without too much fuss. It provided a simple API with MongoDB underneath by requiring the mongoose node package. All the scaffolding was made using yeoman, a really powerful tool that also provides minifying etc. With both front- and backend in place, the only thing left was deployment. This was where things got very interesting, as both parts were deployed to Windows Azure Websites without much work, just two simple deployments from a local Git repository.

At some point I think both ember.js and node.js deserve their own presentations, followed by a session like this to show the power of the Azure platform. When mixed together like this, you are left with a great intro to how the technologies can be used together, but there might still be too much of each if deployment was the goal of the presentation.

Windows Azure Mobile Services

Following up on the previous Azure session was this one about Azure Mobile Services, presented by Beat Schwegler. Beat showed how to make a simple Windows Store application using this particular service of the Azure platform. On the client side he used the NuGet package to interact as simply as possible with the service, really easy. When you create a mobile service in the Azure management portal you start by either creating or selecting an existing SQL Server. The idea behind this is that you get a very simple and easy to use API for basic CRUD operations towards this data storage. What might seem a bit odd is that the table(s) in the SQL Server are used as if they were a document storage, so why not use a document store instead? The tables grow horizontally as properties are added to the stored entities; in my opinion not the job for a SQL Server. What on the other hand looks really great about the way you interact with the storage is that it's done with node scripts. It doesn't require much effort, and with authentication set up on the client it is very easy to have the server side filter data based on the user of the API. So far the only downside is that the scripts are edited directly inside the Azure management portal, and as far as I know that is the only option for now, so much for source control…

Where Azure Mobile Services really shows its full potential is when it comes to push notifications. First up, it includes a scheduler service, which can be used to schedule the execution of a node script, just like a regular cron job. Such a script could be used at specific intervals to check data and, if specific criteria are fulfilled, send a notification to a user of a Windows Phone, iOS, Android or Windows Store app. The code to accomplish this is very simple compared to earlier ways of doing it, and you will save your clients a lot of battery by pushing data!

The conference app

To finish this first day of the conference, I'll put my focus on the GOTO conference app. This year the app has been greatly improved (if you are an iOS or Android user) and it's stable :). Two major new features are voting and speaker questions. As in previous years each session gets a rating, and until today this had been done by simply putting a colored piece of paper in a bucket at the exit. This is now built into the application. It works fine, as long as you remember to open the app and navigate to the session. The other new feature is speaker questions. In previous years you could ask a question at the end of a session using a microphone. Now you can write your question during the session and submit it with the app. This makes it easy for everyone to silently submit and get their questions answered without being embarrassed about speaking in public in front of a lot of people, a really great improvement!