Yesterday I started playing around with AngularJS, one of the many JavaScript frameworks for adding MV* to a web project. Before this I had always used plain jQuery, but I had the feeling I mostly wrote "wiring" code just to connect stuff. I've written a lot of frontend code before, using XAML and MvvmLight, and I really like the simplification the MVVM pattern adds to my client side code.
The project I'm using AngularJS for is a small mobile web app to track time. I've just started the project, so one of the first features I wanted was a simple counter that would display the time tracked. First I implemented the logic using regular jQuery and setInterval. When I started to refactor I just moved the same code into my new AngularJS controller, meaning I would have a mix of functions inside the controller, some being part of the controller and some not.
The HTML that binds to the Angular controller looks like this:
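The original snippet is missing from this post, but based on the description a minimal sketch could look like this (the controller name TimerCtrl is my own invention; getDuration is from the post):

```html
<!-- Minimal sketch; TimerCtrl is an assumed controller name -->
<div ng-app ng-controller="TimerCtrl">
  <h1>{{getDuration()}}</h1>
</div>
```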
The <h1> tag gets a one-way binding to the controller, exposing a method called getDuration. The reason I'm using a function and not a field is that I want to be able to update the value after the initial binding is created, meaning this is not a "OneTime" binding.
The controller has the function defined like this:
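The listing itself is gone from this post, but a sketch of the controller roughly as it looked at this point might be (TimerCtrl, start and the duration field are my own names; getDuration and updateTimer are the ones mentioned in the post):

```javascript
// Sketch of the controller before the fix described in this post.
function TimerCtrl($scope) {
  $scope.duration = 0;

  // Bound from the view; re-evaluated on every digest cycle.
  $scope.getDuration = function () {
    return $scope.duration;
  };

  $scope.start = function () {
    // updateTimer runs outside Angular, so this mutation never
    // triggers a digest and the view is not refreshed.
    setInterval(function updateTimer() {
      $scope.duration++;
    }, 1000);
  };
}
```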
After adding the update of the field I refreshed my browser and started the timer. Nothing happened… What? As simple debugging I went to the Chrome developer tools and even added a console.log($scope.duration) to the code. It worked! But the UI still didn't update. After some hours of searching and watching AngularJS videos on YouTube I finally found the needle I was searching for.
The reason this didn't work was that I was updating the $scope from outside the $scope, since I updated the field inside the updateTimer function. By default this will not update bindings on the $scope! So to ensure that bindings are updated when you are outside the $scope, you need to manually call $scope.$apply().
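A sketch of the fixed controller: the mutation is wrapped in $scope.$apply() so Angular runs a digest and refreshes the binding. Names other than getDuration and the $apply call are my own assumptions; the tick function is pulled out only to keep the sketch easy to exercise.

```javascript
function TimerCtrl($scope) {
  $scope.duration = 0;

  $scope.getDuration = function () {
    return $scope.duration;
  };

  // One timer tick: the mutation now happens inside $apply,
  // so a digest runs afterwards and the one-way binding updates.
  $scope.tick = function () {
    $scope.$apply(function () {
      $scope.duration++;
    });
  };

  $scope.start = function () {
    setInterval($scope.tick, 1000);
  };
}
```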
As I found this "gotcha" I also came across the $timeout service in Angular, which will be the next subject for refactoring towards AngularJS. The smart thing about using this service is that I don't have to call $apply on the $scope manually, and the usage is a bit simpler. But more about that in a later blog post.
Seeding an identity in SQL Server is quite easy and can be applied both during creation of a column and afterwards. When creating a new column in SQL Server Management Studio you can select the seed value for the column. It is also possible to do the same by scripting:
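The script is missing from this post; a sketch of seeding at creation time could look like this (table and column names are made up):

```sql
-- Hypothetical table: IDENTITY(seed, increment) sets the first value (1000)
-- and the step (1) for the Id column.
CREATE TABLE TimeEntries
(
    Id INT IDENTITY(1000, 1) PRIMARY KEY,
    Description NVARCHAR(200) NOT NULL
);
```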
If seeding is not applied upon creation, a re-seed can easily be applied afterwards by using DBCC CHECKIDENT as illustrated. You specify the table that should have its identity re-seeded and the value it should be seeded to, in this case 1001.
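Against the hypothetical table above that would be:

```sql
-- Set the current identity value to 1001;
-- the next inserted row will then get 1002.
DBCC CHECKIDENT ('TimeEntries', RESEED, 1001);
```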
Moving to Azure SQL, creating or changing the seed is currently not supported, among other functionality available in SQL Server on premise. To accomplish the same thing a small trick is necessary. First we need to temporarily turn off the identity, then manually insert a row with the identity value we want the next identity to continue from, and finally we turn the identity back on. All together it looks like this (using Entity Framework migrations):
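The migration code itself is missing here, but the SQL the migration would execute (for instance via its Sql(...) calls) would be along these lines; the table and values are placeholders:

```sql
-- Temporarily allow explicit values in the identity column
SET IDENTITY_INSERT TimeEntries ON;

-- Insert a row carrying the identity value we want to continue from;
-- SQL Server advances the identity counter past the highest inserted value.
INSERT INTO TimeEntries (Id, Description)
VALUES (1000, 'Seed row');

-- Turn automatic identity generation back on
SET IDENTITY_INSERT TimeEntries OFF;
```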
Seeding has now been applied to Azure SQL.
When you are a web developer, one of the great features inside Visual Studio is the ability to transform web.config files. Depending on the configuration of your project (debug, test, release etc.) the file is transformed during publish, and any changes you made in the transform for the selected configuration will be applied to the resulting web.config. One of the most common use cases is to replace the value of the connection string.
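As an example, a Web.Release.config transform that swaps the connection string could look like this (the name and connection string values are my own placeholders):

```xml
<!-- Web.Release.config sketch; values are placeholders -->
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <connectionStrings>
    <add name="DefaultConnection"
         connectionString="Server=prod-sql;Database=MyApp;Integrated Security=True"
         xdt:Transform="SetAttributes"
         xdt:Locator="Match(name)" />
  </connectionStrings>
</configuration>
```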
This is cool if you are a web developer and only use the web.config file, but what if you are a Windows developer with an app.config, or just like to place your log4net configuration in a separate log4net.config file, even as a web developer? By default you are not able to apply transformations to these files. Luckily for us, Sayed Ibrahim has made a nice tool called SlowCheetah. It's a tool that has been around for quite some time, and it enables us to use transformations on anything inside Visual Studio that has the structure of an XML file. Until now the integration of SlowCheetah into a project has been a little tricky. First there was the installation of the tool, quite simple, and after installation you get this new option inside Visual Studio.
An add-on to the context menu where you can add a transformation to an existing file. And furthermore, when you have created a transformation you can right-click it too, and you are now able to see the resulting transformed file!
Now all of this seems quite cool, and it is, but one downside has been using these added transformation capabilities on CI servers. By default the installation of SlowCheetah references your local developer machine's folders, and this information is added to your project file. Of course this would not work on a CI server, and one of the workarounds has been to add the SlowCheetah files (dll + targets) to a solution folder and "simply" edit the project files to reference these files instead of the ones you installed. This works, and I've been a proud user of this approach for the last year or so.
No more hacking of project files
Well, now all of this hacking in project files has been done for the last time! Just a couple of days ago SlowCheetah was published to NuGet, and SlowCheetah as a Visual Studio tool can also be found as an extension to Visual Studio.
When added to Visual Studio and used, you are required to enable package restore for your solution, as the extension will add the files from NuGet to your solution. After this very easy installation you now have the same transformation capabilities as before, but if you look into the project file you should notice the difference.
First there is the PropertyGroup, which now references a NuGet package. This is where we had to make changes before, as it was by default referencing the local developer machine. Second, there is the import statement at the bottom of the file. The files referenced can be found inside the imported NuGet package.
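The project file listing is missing from this post, but the two relevant parts look roughly like this (the package version number is an assumption):

```xml
<!-- PropertyGroup pointing at the NuGet package instead of a local folder -->
<PropertyGroup>
  <SlowCheetahTargets Condition=" '$(SlowCheetahTargets)'=='' ">$(MSBuildProjectDirectory)\..\packages\SlowCheetah.2.5.5\tools\SlowCheetah.Transforms.targets</SlowCheetahTargets>
</PropertyGroup>

<!-- Import statement at the bottom of the project file -->
<Import Project="$(SlowCheetahTargets)" Condition="Exists('$(SlowCheetahTargets)')" />
```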
So what we now have is an easy-to-use SlowCheetah with no quirks to get it up and running in a CI environment out of the box. Let's go transform!
Frontend, backend, junior, senior, ninja… developer are all terms used to describe today's developers. When a company seeks new developers, they often tend to describe what they seek as something that could be placed inside a layered architecture in some software system. It just makes me wonder why. Are you really sure this is the person you are looking for? Why not a dynamic person with the capability to adapt to the environment he is in?
What is a frontend developer actually? Is it the guy doing all the graphics, hacking away in Photoshop all day making smooth graphics and awesome texts? Is it the person that only knows about HTML, and whenever something hits an API he just starts to cry? And the backend developer, should he be a prisoner inside the walls, just tweaking and optimizing the core servers and processes where the hot business code is running? And finally junior vs. senior: is it the new hipster just graduated meeting the old grumpy guy in his 10th year at the company?
As developers this also points at the range of technologies across systems we should be able to work with, and that is great. It allows us to always learn new stuff, just as all of us love to do. And when you really do need a hot MS SQL developer, then be specific and reduce the amount of work you put into hiring. Just be sure to aim right.
The point of this small braindump is a shoutout to companies seeking developers. Think about what you really need, and think about what you write in your advertisement: are you sending the right message and reaching the right audience?
What happens when managers start measuring software teams and talking about results? The teams hide and become nervous. This is not what MDD is about! At today's Warm Crocodile Conference, Mantas Klasavičius gave his point of view on what it takes to do MDD, based on his own experience. Measurement should NOT be something made at a management level; it should instead be something made at all levels during the whole software development life cycle, from the first lines of code until the product is live. But why do MDD at all, and why should I as a developer care? MDD should focus on improving the product and ensuring that you can be proud of what you are doing all day long. But what and how to measure?
We all make features, but are we making the features the customer needs? Often we have a backlog and make the feature the customer marks as most valuable. When the code is complete, the tests are green and the feature is deployed, we tend to forget it; now it's just out there. But what was the actual value of the feature to the customer? Was that really, really important must-meet deadline that important? Did the customer really earn more money, get more customers, and are they using the feature at all? Why not help with more than just developing features; why not help them measure whether this really was the feature they wanted? What if we could make their prioritization even better next time and help them pick a more valuable feature? This is where metrics and measurement come into the picture. Make it simple: write simple metric log statements to a storage and analyse them. It's not cool to write code; it's cool to write code that makes features for real people, with real value.
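The "simple metric log statements" idea can be sketched in a few lines: track named events to a store, then analyse them, for example by counting usage per feature. All names here are my own invention, not from the talk.

```javascript
// Minimal in-memory metrics store: track events, analyse afterwards.
function createMetrics() {
  var events = [];
  return {
    // Log one metric event with a timestamp.
    track: function (name) {
      events.push({ name: name, at: Date.now() });
    },
    // Analysis step: how often was a given feature used?
    countBy: function (name) {
      return events.filter(function (e) {
        return e.name === name;
      }).length;
    }
  };
}
```

In a real system the `events` array would be replaced by writes to persistent storage, but the shape of the question ("did anyone use the remember-my-password checkbox?") stays the same.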
Looking at how we as a team make a feature could also be an internal team metric. Did the feature follow the concept of Minimum Viable Product? Did we add unnecessary technical creep and waste time? What was the Time To Bool, the time it took us to make the feature "As a customer I want to be able to select remember my password"? Should this checkbox feature really take a week to develop and maybe one more to deploy?
One thing is sure: if we don't measure, how should we be able to say that we are improving when we say we do? And then comes the scary part: start measuring. Make it something that we just do, as we should always be improving our skills. Start small; do not try to do it all at once. Gather the team, have each team member write two items they think you should focus on, and then select one to focus on for the next sprint. Find out how you want to gather the metrics. Don't get fooled by all the tools available; make it as simple as possible to get the data you want and are able to use. Find your current level and set a goal. Celebrate when you reach your goal before setting a new one, or maybe adding one more metric.
My metric braindump list
- Application startup time
- Time to first user interaction
- Deployment time
- Test time (unit tests, integration tests etc.)
- Time To Bool
- Feature usage
- Issues per release
Be sure to care about the feedback; nobody likes blame, and nobody intended to deliver low quality or even broken software.
Case – Using Entity Framework from an Azure Web Role is quite a common case. Running it locally using the Development Emulator is as easy as….
Problem – Access Denied: Running a simple ASP.NET MVC site with Entity Framework and Azure is easy. Moving to Azure requires a web role that references the web project. Run the project, have Entity Framework initialize and maybe even create the database… Access Denied. What? As mentioned in Azure Hack #2.0, the Development Emulator runs under the NETWORK SERVICE account. This means that this account also needs to be granted the right permissions on the SQL database before Entity Framework can do its job.
Solution – To grant access, open SQL Server Management Studio and look for Logins. Add a new login for NETWORK SERVICE. If the database already exists, be sure to map the login to it. The result should look like this:
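The same thing can be scripted; a sketch (the database name is a placeholder):

```sql
-- Create a login for the emulator's account and map it to the database
CREATE LOGIN [NT AUTHORITY\NETWORK SERVICE] FROM WINDOWS;
GO
USE MyAppDb;
CREATE USER [NT AUTHORITY\NETWORK SERVICE] FOR LOGIN [NT AUTHORITY\NETWORK SERVICE];
-- db_owner is the blunt option (it lets EF create the schema);
-- a narrower role may be enough for an existing database.
EXEC sp_addrolemember 'db_owner', 'NT AUTHORITY\NETWORK SERVICE';
GO
```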
Run the project again, and Entity Framework should now be able to perform its work.
Using a Certificate Locally with https
Case – Moving from one developer machine to another, or simply starting to use https, requires a certificate.
Problem – No access or missing certificate. Setting up https for an Azure Web Role is quite simple. First a certificate needs to be created, just a development certificate for this purpose. The certificate should be referenced from the web role and exist on the developer machine. Moving to the new machine requires the certificate to be exported and installed. This is easily done using the Microsoft Management Console; be sure to install it for the local machine. Microsoft has a nice description for completing this task. The final result should look like this:
Parser Error Message: ID1024: The configuration property value is not valid.
Error: ID1039: The certificate’s private key could not be accessed. Ensure the access control list (ACL) on the certificate’s private key grants access to the application pool user.
Solution – No access? Well, the web role is running under the NETWORK SERVICE account, so of course this account should be granted access here too. How? Back to the installed certificate in the Microsoft Management Console (MMC). Select the certificate and right-click –> All Tasks –> Manage Private Keys. Look up the NETWORK SERVICE account and add it.
Back in Visual Studio, run the solution. Now it works, using https.
For some time I've been working on an Azure project. The project is based on ASP.NET MVC, uses a WebRole, table and SQL storage, and is secured with https. For a long time it just worked. After some changes and a move to a new development machine, the "just working" started to change; it simply didn't work. The headache and the time spent with Google taught me some tricks, so this post is the starting point for sharing some basic hacks with other developers using the Azure platform. And please feel free to add even more hacks in the comments.
The Move To A New Developer Machine – Azure Hack #1.0 Development Emulator Crash
Case: I do my development on different machines, so moving to a new one shouldn't be an issue. Lately I created a new machine running Windows 7, Visual Studio 2012 and the Azure SDK/tools. Cloning an existing repository from GitHub should be enough to get the solution to build, just as on the other machines.
Problem – Development Emulator crashes on startup: Starting the new solution triggered the Development Emulator and the problems started. Upon start the emulator crashed and reported the following error:
I tried reinstalling the Azure tools but had no luck. Then some searching with Google led me to this post, so I wasn't the only one having problems.
It worked. Now the Azure Development Emulator started.
Three days ago I arrived at the GOTO Conference 2012, ready to be blown away by speakers and content in a mix of networking with the rest of the developer community. Now, three days later, it's all over. So what was it that made GOTO the conference it was this year?
Content is king, that's it, no more no less. Among all the sessions I went to this year, I think the level overall was somewhat below earlier years. When I go to a session I expect to get some value I can bring with me afterwards. It doesn't have to be a specific technology, just something that will make me think in a good way, make me improve or challenge the work I do every day as a developer and as a person. Right now I don't have that feeling from all the sessions. Some sessions had a very impressive abstract explaining the topic but failed in execution.
A great line-up of speakers came to the GOTO Conference this year. To me, names like Scott Hanselman, Hadi Hariri, Linda Rising, Dan North and many more really show the potential of a conference like this. The most important thing is to have speakers who know how to, and like to, inspire other people. People who care about the stuff they are doing.
With this great line-up of speakers, I think the people at GOTO should rethink where they place specific sessions. A session like the one Hadi Hariri gave on Tuesday was placed on the 2nd floor, in a room way too small for a speaker like him; there wasn't enough room for all the people who wanted to attend. Compare that to other speakers and topics whose sessions were located in the largest room with a very small audience.
When I'm attending a conference, an important topic is refreshments. Often the temperature gets a bit below normal because of the number of people, and you sit down all day listening and concentrating. Some have a long way to travel, so refreshments are important. And I think GOTO failed! Arriving in the morning you are lucky if coffee machine number 4 in the row is not empty, and then a little breakfast? No, no food before later on; really disappointing. Well, at least there was always fresh fruit and other snacks in the breaks… No! I'm not saying we should just eat and eat, but to get the most value from a session you need to make sure people are not missing food. The lunch this year was kind of OK, no more, not great.
One thing I've seen at other conferences is having the attendees present demos at the end of the day. It's also something that helps build a great community. I really think GOTO should try this next year; bring in all the goodies.
This year GOTO made it possible for people to join the conference and see the vendor sessions and keynotes for "free". This is a great idea, as the conference is very expensive compared to others. Another idea could be a ticket system where it was possible to buy tickets for a number of sessions. Many people find sessions spread across the entire conference, but because of the expensive tickets they will only attend a single day, and maybe not get as much value from the conference as if they were able to pick the sessions they wanted to attend.