Software Development, Agile Principles and Entrepreneurship Meet Real Life

How To Use Google Analytics From C# With OAuth

March 2, 2014   

Many people know about and use Google Analytics for their websites and apps to collect user behavior and other useful statistics. Most of the time the collected information is also consumed directly from the Google Analytics website. On one of the recent projects I’ve been working on, we needed to extract data from Google Analytics from a C# service with NO user interaction, aka a background service. This blog post is a summary of the roadblocks I stumbled upon to get the job done. All code depends on the Google Analytics V3 API, and the source code for this post is available on GitHub.

Setting up the C# project and adding Google Analytics dependencies

For the purpose of this sample we will use a standard C# console project. Note that I’m running Visual Studio 2013 and .NET 4.5; this will have an impact, as described later. When the project is created we need to include the Google Analytics NuGet package. The package is currently in beta, but I haven’t had any issues using it (the Google Analytics v3 API itself is not in beta). From the Visual Studio Package Manager Console, search for the prerelease package named Google.Apis.Analytics.v3 using the following command

Get-Package -Filter Google.Apis.Analytics -ListAvailable -Prerelease

To install the package execute the following command

Install-Package Google.Apis.Analytics.v3 -Pre

There are quite a lot of dependencies, so I won’t post a screenshot. The important part is to ensure that the -Pre option is included. To proceed from here we need to get the needed keys from Google Analytics and set up OAuth authentication.

Setting up Google Analytics to allow read access from a native service application

To extract data from Google Analytics, we need to create a project in the Google Developer Console. This project is used to manage access and keep track of request usage, which we will look at later. Go to the Google Developer Console and create a project. The project will be listed in the overview; here you can see my project named “My Project”. Selecting the project after it is created displays a screen where we need to enable the Analytics API. If you need access to other APIs later on, this is the place to go.

Enabling OAuth access to the project

For this project we want to use OAuth for authentication. Google provides multiple ways of doing this, depending on the use case. From the Credentials screen select “Create New Client Id”. Here it’s important to select Installed application and Other. Click Create Client Id, and we get back to the project overview with the new client id listed. Record both the Client ID and Client Secret, as we need them later in the program. That’s it, now we are ready to start coding against the API!

Setting up AnalyticsService in C#

Going back to Visual Studio and the project we created earlier, insert the following code into the Main method of the program. Note that this is only sample code :).

using System.Threading;
using Google.Apis.Analytics.v3;
using Google.Apis.Auth.OAuth2;
using Google.Apis.Services;
using Google.Apis.Util.Store;

class Program
{
    static string clientId = "INSERT CLIENTID HERE";
    static string clientSecret = "INSERT CLIENTSECRET HERE";
    static string gaUser = "user";
    static string gaApplication = "My Project";
    static string oauthTokenFilestorage = "My Project Storage";

    private static AnalyticsService analyticsService;

    private static void Main(string[] args)
    {
        var credential = GoogleWebAuthorizationBroker.AuthorizeAsync(
            new ClientSecrets
            {
                ClientId = clientId,
                ClientSecret = clientSecret
            },
            new[] { AnalyticsService.Scope.AnalyticsReadonly },
            gaUser,
            CancellationToken.None,
            new FileDataStore(oauthTokenFilestorage)).Result;

        var service = new AnalyticsService(new BaseClientService.Initializer
        {
            HttpClientInitializer = credential,
            ApplicationName = gaApplication
        });
    }
}
What’s happening here is that we create the AnalyticsService, which is the service we are going to use when sending requests to Google Analytics. We provide it with the information we got earlier when creating the project inside the Google Developer Console. Also note that for this purpose we are using the scope AnalyticsReadonly. Finally, we are passing a FileDataStore. This is the storage that Google will use to store the OAuth tokens issued during the exchanges between Google Analytics and our application. On your machine this will be physically located at %AppData%\&lt;StorageName&gt;, where &lt;StorageName&gt; is the name we passed to the FileDataStore. It is possible to swap in your own implementation using a database etc., but that will be covered in a later blog post; for now we will just use the file storage.

Ready to Authenticate with Google Analytics

When the information in the source code is updated to match your own settings, we are ready to test that we can initialize the OAuth authentication. As mentioned earlier, we will run this service with no user interaction. However, the first time we need to run it “manually”, meaning from a console. This is needed because we will be prompted to allow access and get the initial access_token, but more importantly the refresh_token. The refresh token is the one used to renew the access token when it expires. When we run this the first time, the token information is written to a file inside the OAuth storage specified by our app. From here on we can copy it to wherever we want, whether to our own machine or to a server running our service. So the user interaction is only a one-time thing. If you change account or profile id (development, staging, production etc.), you need to redo this step! Hit F5 and our application starts and… crash!

Could not load file or assembly ‘Microsoft.Threading.Tasks.Extensions.Desktop, Version=, Culture=neutral…….

Now what just happened? Because we are running .NET 4.5, things have changed a bit. A little investigation (thanks to Google and Stack Overflow) shows that this file should be included by the Microsoft.Bcl.Async NuGet package. A quick look into packages\Microsoft.Bcl.Async.1.0.165 reveals that we have a later version, and that the assembly is NOT included in the 4.5 version of the package. To ensure we can authenticate, add a reference to the 4.0 file we are missing; simply browse to the package folder you included and you’ll find both the 4.0 and 4.5 assemblies. Next, add an assembly binding redirect in the project’s app.config file to redirect to the previous version.
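The binding redirect screenshot isn’t reproduced here, but it follows the standard app.config pattern. Something along these lines; verify the assembly name, public key token and version numbers against the actual assembly in your packages folder, as the values below are placeholders:

```xml
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <!-- Check token and versions against the assembly in your packages folder -->
        <assemblyIdentity name="Microsoft.Threading.Tasks.Extensions.Desktop"
                          publicKeyToken="b03f5f7f11d50a3a" culture="neutral" />
        <bindingRedirect oldVersion="0.0.0.0-1.0.165.0" newVersion="1.0.165.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>
```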

Now, with both the assembly reference and the assembly binding in place, run the project again. This time the project launches a browser where we are asked to allow the application access to our Google Analytics data.


If we take a look into the OAuth token file storage located in %AppData%\&lt;YourStorageName&gt;, we should now see a file containing our token information: access_token, refresh_token and some other fields. Remember that the most important part here is the refresh token, as it is used to renew the access token whenever it expires.
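I won’t paste my real token file, but the stored content is roughly shaped like this (all values are placeholders, and the exact field set may differ between library versions):

```json
{
  "access_token": "ya29.placeholder",
  "token_type": "Bearer",
  "expires_in": 3600,
  "refresh_token": "1/placeholder",
  "Issued": "2014-03-02T12:00:00.000+01:00"
}
```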

The 1st Request To Analytics Data

By now everything is ready and we can issue the first request. Enough writing, let’s go code first.

string profileId = "ga:INSERT PROFILEID HERE";
string start = new DateTime(2014, 1, 1).ToString("yyyy-MM-dd");
string end = new DateTime(2014, 1, 10).ToString("yyyy-MM-dd");

var query = service.Data.Ga.Get(profileId, start, end, "ga:visitors");
query.Dimensions = "ga:visitCount,ga:date,ga:visitorType";
query.Filters = "ga:visitorType==New Visitor";
query.SamplingLevel = DataResource.GaResource.GetRequest.SamplingLevelEnum.HIGHERPRECISION;

var response = query.Execute();

Console.WriteLine("Entries in result: {0}", response.TotalResults);
Console.WriteLine("You had: {0} new visitors from {1} to {2}",
    response.TotalsForAllResults.First().Value, start, end);
Console.WriteLine("Has more data: {0}", response.NextLink != null);
Console.WriteLine("Sample data: {0}", response.ContainsSampledData);

Building the request

For the purpose of this post we will just make a simple query: “Get the new visitors to my blog, and not the returning ones”. First up we use the Get method of the AnalyticsService. This method takes a profileId, the date range we are requesting within, and finally the metric we want to measure. The profileId is the unique id of the “table” for the account we are querying. To get the profileId, navigate to the administration section of your Google Analytics account. From there, click “View Settings” in the View column and you’ll see the profileId listed as the View Id. With the query created, we are able to define the dimensions (output) and additional filters to further narrow down the result set. For the purpose of this blog post I’ve added New Visitor as a filter, to filter away returning visitors. Before looking at the response format, please note the format of the input parameters. The profileId, metric(s), dimension(s) and filter(s) are all prefixed with ga:. This is mandatory, and if it’s missing you will get really weird errors. The same goes for the date format. For a detailed description, take a look at the Core Reporting API – Reference Guide.

Handling the response

Once the request is ready we call Execute to get the response. The result contains very detailed meta information besides the actual result. As you can see I’ve listed four properties: TotalResults, TotalsForAllResults, NextLink and ContainsSampledData. TotalResults is a count of the entries contained in this response. TotalsForAllResults is the sum over all results, according to the metric. NextLink indicates whether we got all results in this response; if not, it points to the next result set. There is a limit of 1000 results per request, but the NextLink makes it very easy to navigate to the next result set. Finally, ContainsSampledData indicates whether the result is sampled. To get a more precise extract you should limit the request date range and also set the SamplingLevel of the request to HIGHERPRECISION.
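My own pagination code isn’t part of this sample, but based on the API surface used above, walking the pages could be sketched roughly like this: bump StartIndex by the page size until NextLink comes back null. Treat it as a sketch, not tested production code:

```csharp
// Sketch: page through all results, 1000 rows at a time.
var allRows = new List<IList<string>>();
int startIndex = 1;
GaData page;

do
{
    var pagedQuery = service.Data.Ga.Get(profileId, start, end, "ga:visitors");
    pagedQuery.Dimensions = "ga:date";
    pagedQuery.MaxResults = 1000;
    pagedQuery.StartIndex = startIndex;

    page = pagedQuery.Execute();
    if (page.Rows != null)
        allRows.AddRange(page.Rows);

    startIndex += 1000;
} while (page.NextLink != null); // NextLink is null on the last page
```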

Debugging requests

While building the requests the Google API Explorer comes in very handy. Here you can test all your input and investigate the request and its details.

At the end of the road there is a stop sign

My final note on the basic request setup is a heads up about the default request limits and quotas defined by Google. Please get a good understanding of where your limits are, both in number of requests and in frequency. And when you exceed them, as I did, the Error Responses documentation is a good place to start implementing a more solid foundation. To follow the request usage of the current project/profile, use the old Console Monitor, accessible from the Developer Console. Click APIs, select the Analytics API we enabled earlier, and you’ll see the menu to access both the quota monitor and the report. The report lets you monitor your current and previous usage, and the quota section lets you request an increased request frequency etc. So far the new Cloud Console (see the link in the screenshot) doesn’t seem to monitor correctly, and its update frequency is very low. For now use the “old” one, where you can also download a request report at day level, see failed requests etc. I hope this demystifies the basics of accessing Google Analytics from C#. All source code is available on GitHub.

Execute Book

November 16, 2013   

Ok I’ll admit it: one of my challenges as a developer is to execute. I often get very curious about new technologies, which might harm the execution of the product I’m working on. I’m aware of it, and after reading this great book I really got inspired and took away some things to improve myself.

This great book by Drew Wilson & Josh Long was written in only 8 days! It’s about a product built within only 8 days. I read a tweet about the book, watched the video and quickly identified myself :). I bought the book, and 1½ weeks later I had read it from start to end; very easy to read and a joy all the way.

I really like how the book is built, and you can almost feel the energy of the writers as you read along. It might be easier for a software entrepreneur to identify with the content. You will get a lot of input on how and why to forget your old habits and start to execute. Do the thing you feel, and do it now; all you need is focus. There isn’t always a need to make big plans, as these will often be a roadblock towards shipping what you’re building. No one knows the future, and you can’t plan what your product will be in the future. Ship and get feedback.

I think there is more than one target group for this book. First, there are the entrepreneurs who want to create and ship products, but the book might as well be an inspiration in software companies building projects for clients. The latter part of the book has some great perspectives on the use of technology and how to get yourself moving forward. Interestingly enough, the role of the “builder” in the book could be called a fullstack developer: one that cares about and can work in every phase of a product’s lifecycle. You will learn as you go along, and features should be what drives your learning.

I really enjoyed reading the book and would recommend it to everyone who’s interested in building products and shipping ’em.

Buy the book here.

The $100 Startup – value for every cent

October 27, 2013   


For some time I’ve been reading this fantastic book by Chris Guillebeau; I’ve finally finished it, and here are my thoughts.

I originally found the book based on some references at Amazon, and thought of it as a fast read and an energy kick. I wasn’t let down.
Overall the book describes a lot of startups in different business areas, all started with a very small amount of money, the $100. The idea behind most of them has not been to create a million dollar business, but to follow a passion and change the founder’s way of living. One remarkable thing is that it is not yet another Facebook story about a group of IT students. The stories span all levels of society: from the homeless, to the manager who just loves to travel, to the mom who wants to spend more time with her children and family. Follow your dreams and trust in them.
A big part of the book is written as guides, also available on the website: easy get-going guides, the micro business plan, how to think about monetizing your business, and later how to increase your sales, just to name a few examples. Exactly this part is what makes the book excellent and makes it work as a reference; I personally know that I will reread some of the chapters later. The hints are easy to understand and bring to life for everyone. While reading the book I had to take a break from time to time, because I started to think! Think about all the inputs I got from the book, and about whether I could use them in some way for my own business ideas. I would recommend this book to anyone interested in entrepreneurship and small businesses; you won’t be let down.

Find the book here

The book’s homepage: 100startup

The Attackers Are Here, don’t wait!

October 1, 2013   

Don’t wait: when you deploy your service online, you are subject to attack. Web security was one of the main topics at GOTO today, and could very well be explained by this very simple comic.

First up, Aaron Bedra did a great talk about some of the defence mechanisms you can use to protect your online services, and especially how to look for anomalies in the requests you receive. He demonstrated ModSecurity, a rule based web application firewall. But ModSecurity cannot stand alone; instead it should be used together with tools like Repsheet.

Repsheet is a collection of tools to help improve awareness of robots and bad actors visiting your web applications. It is primarily an Apache web server module with a couple of addons that help organize and aggregate the data that it collects.

Aaron demonstrated different indicators to look for in incoming requests; as indicators of fraud etc. stack up in a request, the chance of the request being an attack attempt increases. Aaron really made his point and demonstrated some great examples. Repsheet really seems like a great product and could be a great addon for PaaS providers like Azure.

The second part of the web security sessions focused on the possible attacks one could mount against an online service. The session was presented by Niall Merrigan and covered the items in the OWASP top 10. As he stated from the start:

Developers don’t write secure code because we trust, are lazy and security is hard

And this sounds just about right, but as he pointed out, example by example, we might want to understand just a bit more about security and put some effort into it. Very simply, he googled for web.config and got access to some running services’ web.config files, ready to be downloaded. He then pointed out that IIS would normally protect you from having your web.config file downloaded. But with a simple trick there is a basic workaround, and suddenly your web.config information, like connection strings etc., is public and could easily be exploited.

No clear text passwords!!

Of course you could hash your passwords, but if someone already has your web.config file, they also have your machineKey, and suddenly the world is not that secure any more. One of the more basic but still very common issues is SQL injection. Niall asked a very basic question: why do we often use the same connection string for both read and write operations? Why do you need write permissions to do a search? Very good question. Niall gave us a lot to think about, and I think it was a wakeup call for most of the audience to start thinking more about the solutions we make in our daily work.
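To make those two points concrete, here is a small sketch of my own (the connection string and table names are made up): do searches through a connection whose login only has read permissions, and always pass user input as parameters rather than splicing it into the SQL text:

```csharp
// Sketch: a search should use a read-only login and parameterized SQL.
using (var connection = new SqlConnection(readOnlyConnectionString))
using (var command = new SqlCommand(
    "SELECT Id, Title FROM Posts WHERE Title LIKE @term", connection))
{
    // Parameters are sent as data, never concatenated into the SQL text.
    command.Parameters.AddWithValue("@term", "%" + searchTerm + "%");

    connection.Open();
    using (var reader = command.ExecuteReader())
    {
        while (reader.Read())
            Console.WriteLine(reader.GetString(1));
    }
}
```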

Continuous Brain Food Delivery

October 1, 2013   

What makes a car drive? The motor.

What makes the motor run? Fuel

What do you do to keep the motor running? You continuously refuel the car

Now you might ask how this fits into a development conference like GOTO;

What makes you learn? Your brain receives input

What makes the brain receive inputs? Energy

How do you get energy? You continuously ensure you get the right kind and amount of food.

So what’s all this about? I’m sad to say, but the food level at the GOTO conference is just way below average for a conference at this price level. All participants pay quite a lot of money for the ticket and expect to get value for money. But if the brain is not running all day long, you will not get the full output from the sessions, which you should at a conference with great sessions like this.

Having been to a couple of conferences over the last couple of years, I’ve seen different approaches to ensuring we all get food, and the right kind of food. Some use prepacked lunch boxes to reduce the food queue. Others ensure that there is always food available to keep you going, with no peak situations between sessions: continuous food delivery. Then there is GOTO, where you get ONE meal during a whole day of the conference and an absolute minimum of snacks during the breaks. You might argue we’re developers and could live off only coffee and energy drinks; well, time to get back to reality. When you finally head off to have lunch, you’ll probably end up in a queue just to wait, wait and wait. That’s not good enough.

Just as when developing software we try to learn the domain, the organisation behind GOTO should start taking this topic seriously, as this is THE major, but very important, pitfall IMO. The location is great, the audience is great, the speakers too, but the food really sucks. I really hope to see more continuous food delivery, just like at NDC Oslo etc. It really works. At NDC you have food at all hours of the day, and whenever you feel like it you can eat, so your brain never shuts down because of low energy.

Just to illustrate the difference between GOTO and another development conference in Scandinavia:

(Photo: the GOTO food queue)

Windows Azure Mobile Services and the documents in the SQL Server

October 1, 2013   

Yesterday I attended a session on Windows Azure Mobile Services (WAMS) at GOTO Aarhus. All the great features of WAMS were demonstrated, including how easy it is to store and retrieve data. One thing I still don’t get is why the storage underneath has to be a SQL Server. “Has to be” in the sense that when you create your Mobile Service in the management portal, this is a requirement!


The usage of the storage is just like handling documents in a document database like MongoDB etc., so why not use the right tool for the job? A SQL Server is not cheap to use, and it’s not meant to be a document store. IMO it would be a huge advantage to offer the option of a document database and get all the benefits. On the other hand, you can also use parts of WAMS, like the scheduler, without using the storage at all, so why should you have to create one in the first place? I understand the easy CRUD support, but I still think it should be optional whether you want SQL Server storage or no storage at all. You could make a scheduled mobile service that used any other service to retrieve its data and still make use of the super easy push mechanism inside WAMS. Don’t spoil a great service by making assumptions. If you store data with WAMS it also behaves like a document storage, and using the node scripting in the management portal you feel kind of in the world of NoSQL, but you’re not. Looking forward to enabling my Azure MongoLab add-on with WAMS and using its aggregation framework to make it even more powerful…


GOTO; Day#1

September 30, 2013   

First day of the GOTO conference 2013 has come to an end. As always, the first day of a conference is a mix of impressions: you meet a lot of old friends to catch up with, while trying to pick the right sessions to attend.

Syntaxation – Douglas Crockford

This was my first session of the day, presented by Douglas Crockford. Douglas stuck strictly to his headline and presented various programming languages like Fortran, Fortran IV, BCPL and Algol 60/68, comparing how syntax has evolved over time. It is not so much that the language itself made a syntax a success or not; rather, we as developers and users of a programming language have an emotional feeling about how a specific syntax should be. Even an optimization like minimizing syntax by removing “{ }”, or not ending lines with “;”, is something we have “feelings” about. Douglas did a walkthrough using the if/else statement as his use case, which was simple enough to understand and still complex enough to express how the different languages interpret it.

The second part of the session was related to designing programming languages, and as he stated:

We should all be designing programming languages

A language should be innovative, not just yet another “CokeBottle cokeBottle = new CokeBottle();” language; make a difference. Make a language that does one specific thing very well and stays focused. Make it non-confusing to use and non-cryptic. The session was well presented and made its point. Even though we are not all inventors of programming languages, we still have an opinion about the languages we use, and that message was made clear.

Modern web app development with ember.js, node.js, and Windows Azure

Not a day goes by without another noun-dot-js being brought to life on social media; just like the headline for this session, presented by Tim Park. This fast paced session covered the basics of building a Twitter client consisting of ember.js, node.js and Windows Azure. The frontend was created with ember.js, a single page application framework that is by nature a full blown clientside framework including model binding, templating and routing. It wasn’t a presentation specific to ember.js, and I think Tom Dale would put more focus on the URL when creating an ember.js application than Tim did. The backend for this application was created using node.js, a very minimalistic backend without too much fuss. The backend provided a simple API using MongoDB underneath by requiring the mongoose node package. All the scaffolding was done using yeoman, a really powerful framework that also provides minifying etc. With both front- and backend in place, the only thing left was deployment. This was where things got very interesting, as both parts were deployed to Windows Azure Websites without too much work: just two simple deployments from a local Git repository.

At some point I think that both ember.js and node.js deserve their own presentations, followed by a session like this to show the power of the Azure platform. When mixed together like this, you are left with a great intro to how the technologies can be used together, but there might still be too much of each if deployment was the goal of the presentation.

Windows Azure Mobile Services

Following up on the previous Azure session was this one about Azure Mobile Services, presented by Beat Schwegler. Beat showed how to make a simple Windows Store application using this particular service of the Azure platform. On the clientside he used the NuGet package to interact as simply as possible with the service; really easy. When you create a mobile service in the Azure management portal, you start by either creating or selecting an existing SQL Server. The idea behind this is that you get a very simple and easy to use API for basic CRUD operations against this data storage. What might seem a bit odd is that the table(s) in the SQL Server are used as if they were a document storage, so why not use a document store instead? The tables are scaled horizontally as properties are added to the entities stored; in my opinion, not the job for a SQL Server. What on the other hand looks really great about the way you interact with the storage is that it’s done with node scripts. It doesn’t require much effort, and with authentication set up on the client it is very easy to make the server side filter data based on the user of the API. So far the only downside is that the scripts are edited directly inside the Azure management portal, and as far as I know that is the only option for now; so much for source control…

Where Azure Mobile Services really shows its full potential is when it comes to push notifications. First, it includes a scheduler service, which can be used to schedule the execution of a node script, just like a regular cron job. Such a script could run at specific intervals to check data and, if specific criteria are fulfilled, send a notification to a user of a Windows Phone, iOS, Android or Windows Store app. The code to accomplish this is very simple compared to earlier ways of doing it, and you will save your clients a lot of battery by pushing data!

The conference app

To finish this first day of the conference, I’ll put my focus on the GOTO conference app. This year the app has been greatly improved (if you are an iOS or Android user) and it’s stable :). Two major new features are voting and speaker questions. As in previous years, each session gets a rating, and until today this was applied by simply putting a colored piece of paper in a bucket at the exit. This is now built into the application. It works fine, as long as you remember to open the app and navigate to the session. The other new feature is the speaker questions. In previous years you could ask a question at the end of a session using a microphone. Now you can write your question during the session and submit it with the app. This makes it easy for everyone to silently submit and get their questions answered, without being too embarrassed to speak in public in front of a lot of people; a really great improvement!

One simple contribution

September 16, 2013   

Last year I attended GOTO; as a blogger for the first time. This was also the first time the conference made it possible to attend as a blogger. Normally when I blog, I write because I have something I want to share that I think might be useful for others to know. It is often technically related, like my small Azure hacks, but when it comes to blogging during a conference it is often either the takeaways directly from a talk or my own reflection on a topic. Last year at GOTO; this resulted in these blog posts:

As I’ve written before, a conference is not only about the sessions. To me, and I guess a lot of other people, a conference is also about meeting new people. Related to this, I attended NDC Oslo 2013 earlier this year, a conference a bit similar to GOTO;. During one of the breaks, I was talking to a guy from Pluralsight in the exhibitor area. As usual he asked where I came from, whether I knew Pluralsight, and everything else a sales guy would ask to try and get your money :). But the most interesting part was that I got the chance to tell him about the developer community in Denmark and… GOTO;. During the next break the guy from Pluralsight got hold of me and told me that they had just signed up for this year’s GOTO;. Now that is so cool, and today you can find Pluralsight on the sponsor page of GOTO;. My advice is to do whatever you feel is right to add your contribution to the community you are a part of. Even the smallest thing will bring value!

Next up GOTO;

September 15, 2013   

So autumn is just around the corner, and so is this year’s GOTO; conference in Aarhus. For me this conference is a recurring event at this time of year, and again I’m looking forward to some great days with a lot of input and networking.

To prepare for the sessions I’m expecting to participate in this year, I’ve taken a look at each speaker description and the headlines of their sessions. Based on this, my top 5 presentations for this year, in random order, are:

  • Douglas Crockford – Syntaxation
    • I’ve never attended one of his sessions, but like many others I’ve only heard good things.
  • Dan North
    • At almost every conference I’ve been to there have been some agile sessions. Often it’s like “scrum does this”, “if you use kanban you will be so much …” etc. Many of them tend to recite the books, but not all of them are able to “sell” me their message. Dan is! He really knows how to convince and influence his audience, with real life experience and not just schoolbooks.
  • Brent Beet – How GitHub build products
    • I must admit that I really admire GitHub: the culture, the way they build the company and all the awesome people working there, Phil Haack and Zach Holman just to mention a few. I’m expecting to get a lot of input for my own daily work from this session.
  • Bodil Stokke – Programming Only Better
    • Earlier this year I attended NDC 2013 and missed the session by Bodil. Afterwards I heard so much publicity about her presentation that I don’t want to miss it again. Topics like Clojure and Haskell might not be ones I use directly, but the best takeaway I get from conferences is inspiration, and a presentation like this, with a great presenter and a topic a bit outside my hands-on experience, sure fits that.
  • Beat Schwegler – Windows Azure Mobile Services – A node.js based service backend for Android, iOS, HTML5 and Windows apps
    • This is just one of a few Azure sessions. I really love the Azure platform and think its growth over the last couple of years is so cool. One topic I haven’t used that much is mobile, and the same goes for node.js. I hope to get some great insights and be inspired to expand my Azure knowledge a bit more.


Besides the sessions I also hope to have some great networking days, and as GitHub is sponsoring the conference party I can’t see why this shouldn’t be a success :)

Modifying AngularJS $scope from outside the $scope

March 23, 2013   

Yesterday I started playing around with AngularJS, one of the many JS frameworks for bringing MV* to a web project. Before, I had always used basic jQuery, but had the feeling I wrote much “wiring” code just to connect stuff. I’ve written a lot of frontend code before using XAML and MVVM Light, and I really like the simplification the MVVM pattern adds to my client side code.

The project I’m using AngularJS for is a small mobile web app to track time. I’ve just started the project, so one of the first features I wanted was a simple counter displaying the time tracked. First I implemented the logic using regular jQuery and the JS setInterval function. When I started to refactor, I just moved the same code into my new AngularJS controller, meaning I would have a mix of functions inside the controller, some being part of the controller and some not.

The HTML that binds to the Angular controller looks like this
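The original snippet didn’t survive here, but given the binding described below, the markup likely looked something like this minimal sketch (the controller name is my own placeholder):

```html
<!-- Hypothetical reconstruction; the original snippet was an embedded image. -->
<div ng-app ng-controller="TimerCtrl">
  <h1>{{getDuration()}}</h1>
</div>
```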


The <h1> tag gets a one-way binding to the controller, exposing a method called getDuration. The reason I’m using a function and not a field is that I want to be able to update the value after the initial binding is created, meaning this is not a “OneTime” binding.

The Controller has the function defined as this
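The screenshot of the controller code isn’t reproduced here; a hedged sketch of what it likely contained (module wiring omitted, and the name TimerCtrl is my placeholder):

```javascript
// Hypothetical reconstruction of the controller from the lost screenshot.
function TimerCtrl($scope) {
    // Total tracked time, updated later by the timer.
    $scope.duration = "00:00:00";

    // Bound from the view; re-evaluated on every digest cycle.
    $scope.getDuration = function () {
        return $scope.duration;
    };
}
```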

This will make the UI display an initial value of “00:00:00” when the binding is created. But as I want to update the value when my timer ticks, I should just update the $scope.duration variable. The update function referenced from the JavaScript setInterval method looks like this, and updates the $scope.duration field.
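The screenshot of that first attempt is gone; sketched from the description (the duration formatting is my own assumption), it ran inside the controller roughly like this:

```javascript
// Sketch of the update function as first written - no $apply.
var seconds = 0;

function updateTimer() {
    seconds++;
    // Mutating the $scope field from outside Angular's digest cycle...
    $scope.duration = new Date(seconds * 1000).toISOString().substr(11, 8);
    // ...so the binding in the view is never refreshed.
}

setInterval(updateTimer, 1000);
```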

After adding the update of the field, I went along and refreshed my browser and started the timer; nothing happened… What? As simple debugging I went to the Chrome developer tools and even added a console.log($scope.duration) to the code. It worked! But the UI still didn’t update?? After some hours of searching and watching AngularJS videos on YouTube, I finally found the needle I was searching for.

The reason this didn’t work was that I was trying to update the $scope from outside the $scope, since I updated the field inside the updateTimer function. By default this will not update bindings on the $scope! So to ensure that bindings are updated when you are outside the $scope, you need to manually call $scope.$apply().
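The corrected version from the lost screenshot, sketched with the same assumed formatting, wraps the mutation in $scope.$apply() so Angular runs a digest cycle and refreshes the binding:

```javascript
// The working version: mutate the scope inside $scope.$apply().
function updateTimer() {
    seconds++;
    $scope.$apply(function () {
        $scope.duration = new Date(seconds * 1000).toISOString().substr(11, 8);
    });
}

setInterval(updateTimer, 1000);
```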

As I found this “gotcha”, I also came across the $timeout service in Angular, which will be the next subject for refactoring towards using AngularJS. The smart thing about using this service is that I don’t have to call $apply on the $scope manually, and the usage is a bit simpler. But more about that in a later blog post.