Bejabbers
Friday, November 16, 2012
Static Code Contract analysis in VS 2012 Pro!!
As a footnote, I am not a fan of the strategy of the One OS To Rule Them All approach. I am not thrilled with my Windows 8 laptop setup. The UI between PC and tablet should be similar but fundamentally different. Unless my laptop has a touch screen, I want a keyboard/mouse centric UI with a conventional start menu. As a developer, I want to go further, and do as much as I can with a keyboard, without even having to reach to the mouse. Working in an office, I certainly can't use speech recognition either. It is important for Microsoft to allow business units that focus on different devices to innovate on their own. Long live skunkworks! After ScottGu's nice response, maybe I should pass these thoughts along to Steven Sinofsky as well ... Oh too late, nice knowing you Steven.
Tuesday, September 13, 2011
Passing objects of anonymous types as parameters to method
void Main()
{
    dynamic foo;
    var bar = new { name = "John", title = "code monkey" };
    var bar2 = new { name = "John", title = "junior code monkey" };
    foo = bar;
    WriteIt(foo);
    WriteIt(bar2);
}

static void WriteIt(dynamic dynamo)
{
    Console.WriteLine(dynamo.title);
}
One important gotcha with dynamic typing is that extension methods aren't currently available through dynamic dispatch. That is a real shame, since the two techniques would otherwise go together naturally for more Ruby-esque development. At some point, if you really want or need to program with dynamic techniques in .NET, you are better off with IronRuby or IronPython. Sadly, Microsoft's interest in the DLR appears to have waned after it succeeded in making Microsoft Office programming easier in .NET. You can cast back to a static type, though, but only if your type is not anonymous.
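A minimal sketch of the extension-method gotcha. A string is an IEnumerable&lt;char&gt;, so the Enumerable.First extension works on a statically typed receiver, but the runtime binder only sees instance members on a dynamic one. The workaround shown, calling the extension as an ordinary static method, is one option:

```csharp
using System;
using System.Linq;
using Microsoft.CSharp.RuntimeBinder;

class Program
{
    static void Main()
    {
        var text = "hello";

        // Extension methods are resolved at compile time, so this works:
        Console.WriteLine(text.First());   // h

        dynamic d = text;
        try
        {
            // With a dynamic receiver the binder only considers instance
            // members of string, so Enumerable.First is never found.
            Console.WriteLine(d.First());
        }
        catch (RuntimeBinderException)
        {
            // Workaround: invoke the extension as a plain static method.
            Console.WriteLine(Enumerable.First(d));   // h
        }
    }
}
```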
Tuesday, August 23, 2011
Compiler As a Service (CAAS) for VB.NET
Microsoft has been promoting a futures feature of its .NET compilers called Compiler As A Service (CAAS). CAAS will allow direct access to the functionality of the compiler. From a Microsoft job posting described here, CAAS allows “REPL, write-your-own refactorings, C# and VB as scripting languages, and taking .NET to the cloud”. I think VB.NET, in particular, stands to gain, with the potential for C# code to become convertible to VB.NET.
“Tightening the feedback loop” has been my mantra over the past year. In one sense of the phrase, REPLs (read-evaluate-print loops) are a way to get immediate feedback on your code. I have been using, and loving, LinqPad a lot lately. It is far more than a way to run LINQ queries; it is a very well thought out code snippet compiler. It isn't truly a REPL: code is compiled with the benefit of behind-the-scenes code that LinqPad provides to form a complete class, and, unlike in a REPL, once code has run its variables aren't kept globally available for continued use. Essentially it is read-evaluate-print without the loop. LinqPad still succeeds in providing much quicker feedback. Besides LinqPad, PowerShell and the Firebug JavaScript command line for Firefox are tools I use frequently.
Aspect-oriented programming with tools like PostSharp could be greatly enhanced by CAAS. PostSharp works by weaving code into assemblies after compilation, and I think weaving might be significantly easier with the compiler's functionality opened up. The job posting suggests refactoring could benefit in the other direction, as a mini-compilation step done in the background to assist in changing the codebase, though I wouldn't want to speculate how at this point.
As a LinqPad user working in VB.NET you are a second-class citizen, since Intellisense is only provided for C#. Joe Albahari explains on Stack Overflow that CAAS would let him provide VB.NET Intellisense more easily.
Putting a VB.NET-specific spin on CAAS is the potential to seamlessly convert between VB.NET and C#. One of the obstacles facing VB.NET is the necessity of converting code snippets available only in C#. For example, trying to convert a C# LINQ statement to VB.NET fails utterly using Telerik's converter, http://converter.telerik.com/. CAAS could address a real pain point in using VB.NET.
Friday, August 19, 2011
Nesting Depth Metric
Friday, August 12, 2011
NCover and working with Code Coverage tools
I have been reluctant to embrace the TDD philosophy in my development. I have spent the last year adding unit tests after coding, relying on DI techniques, with the tests aimed at “low-hanging fruit” scenarios: lower-level classes with fewer dependencies and logic-centric classes. Classes with dependencies like communications, file I/O, and timing-based events tended to be ignored. Not surprisingly, my defects were concentrated in the areas not covered by tests. I did employ a practice of writing at least one unit test for each defect.
Recently, after getting encouragement from management to spend time improving unit tests (in part due to schedule slack), I spent some time working with NCover, one of the leading .NET code coverage tools.
NCover has been simply excellent to work with. I have used NCover 3 Complete which, at $479, is pretty pricey; NCover 3 Classic is reasonably priced at $199, and the features at the Classic level suit my current usage fine. The UI presentation for new projects emphasizes the quickstart documentation, and the focused ribbon-based UI made it easy to navigate. I found it a nice workflow to move between production code viewed in NCover and test code viewed in Visual Studio. I tend to open many documents in Visual Studio at once, and not having to bounce between test and production code files in Visual Studio helped me be more productive.
I later spent some time working with Visual Studio 2010 Code Coverage (available with the Premium or Ultimate editions). I think it important to mention that my development environment may well affect my perspective. My machine is modern but lacks a solid-state drive. Visual Studio crashes or hangs more often than I would like, is kinda slow, and uses lots of virtual memory. I have never met an add-in that I didn't want, so that could well be affecting my experience. These factors collectively encourage me to favor non-integrated tools. More importantly, my single monitor with a 1024x768 display heightens my desire to minimize the number and size of Visual Studio windows I keep open.
Having given these caveats, I strongly preferred my experience with NCover. First, my issues above make Visual Studio feel cramped as it is. Second, debugging MSTest unit tests and obtaining code coverage are mutually exclusive; I can have one or the other turned on at any given time. Hitting Ctrl-R, Ctrl-A to debug into unit tests is the sweet spot of my test workflow, and I really don't like the idea of having to switch settings regularly. Third, NCover's windows and icons are tightly focused on its core tasks, which helped make learning NCover really easy. In contrast, Visual Studio leverages its existing code editor, status windows, menus, and icons, so in Visual Studio Ultimate the code coverage-related options are scattered amidst a plethora of features. Fourth, Visual Studio requires instrumenting assemblies whereas NCover doesn't. Since my assemblies are usually signed, this means the assemblies must be modified and then re-signed for each coverage run.
Wednesday, August 10, 2011
Here is a LINQ to a zip file with my presentation and all sample LinqPad files.
Here are a set of LINQs covered in the talk.
- Test what you have learned about LINQ by taking the LINQ quiz!
- LinqPad, a wonderful tool used heavily in this talk.
- A really lucid explanation of grouping in LINQ
- An explanation of a common gotcha with LINQ, and really .NET in general: the need to implement GetHashCode (along with Equals) so equality checks behave as expected. This is a good example of the pain you can go through, but take away this lesson: implement GetHashCode!
- This is a nice explanation of how to deal with the possibility of an IEnumerable that contains no elements. Stack Overflow has many great LINQ questions, often answered by Jon Skeet. An excerpt from this post is "Use First when you know that there is one or more items in the collection. Use Single when you know that there is exactly one item in the collection. If you don't know those things, then don't use those methods. Use methods that do something else, like FirstOrDefault(), SingleOrDefault() and so on."
- An excellent book, Linq In Action has code samples that plug in directly to Linqpad.
- Hooked on Linq offers a brief introduction to LINQ. The author also has written a book Linq to Objects Using C# 4.0.
- Microsoft has published 101 LINQ examples.
- You can't research LINQ much without running into Jon Skeet's work. He has a new training series on C# at Tekpub that takes a gentler approach. Beyond that, if you really want a deep understanding of C# features and LINQ, his material, particularly his blog and presentations, can be rather challenging. If you are up for the challenge, I have found it to be very rewarding.
- The Interactive LINQ project from Microsoft is still in an early stage but is worth checking out. Here is an overview of the project.
- LINQ can be considered a pull technology like most database access approaches. Microsoft has provided a set of Reactive Extensions that provide push technology, kind of like SQL Server Query Notifications.
- If you'd like to read more about LINQ to WMI, here is my codeproject article.
- Here is a helpful presentation on closures by Stuart Langridge. It is in JavaScript, but the same concepts apply.
- Linq perspectives from Bart De Smet.
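The GetHashCode gotcha mentioned above can be shown in a minimal sketch. The Point type here is hypothetical, chosen for illustration; without the two overrides, Distinct would report both instances as different:

```csharp
using System;
using System.Linq;

class Point
{
    public int X { get; set; }
    public int Y { get; set; }

    public override bool Equals(object obj)
    {
        var p = obj as Point;
        return p != null && p.X == X && p.Y == Y;
    }

    // Equal objects must return equal hash codes, or set-based operators
    // like Distinct, GroupBy, and Union will silently misbehave.
    public override int GetHashCode()
    {
        return (X * 397) ^ Y;
    }
}

class Program
{
    static void Main()
    {
        var points = new[]
        {
            new Point { X = 1, Y = 2 },
            new Point { X = 1, Y = 2 },
        };
        // With both overrides in place, the duplicates collapse to one.
        Console.WriteLine(points.Distinct().Count());   // 1
    }
}
```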
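Jon Skeet's advice on First versus Single, quoted above, comes down to what each method assumes about the sequence. A small sketch of the failure modes:

```csharp
using System;
using System.Linq;

class Program
{
    static void Main()
    {
        var empty = Enumerable.Empty<int>();
        var several = new[] { 1, 2, 3 };

        // First and Single both throw on an empty sequence...
        try { empty.First(); }
        catch (InvalidOperationException) { Console.WriteLine("First threw on empty"); }

        // ...and Single also throws when there is more than one element.
        try { several.Single(); }
        catch (InvalidOperationException) { Console.WriteLine("Single threw on several"); }

        // FirstOrDefault returns default(T) instead of throwing on empty.
        // Note SingleOrDefault still throws if there are multiple elements.
        Console.WriteLine(empty.FirstOrDefault());   // 0
    }
}
```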
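Although Stuart Langridge's closures presentation is in JavaScript, the same idea carries over directly to the C# used throughout this talk. A minimal sketch: the lambda captures the local variable, which outlives the method that declared it:

```csharp
using System;

class Program
{
    static Func<int> MakeCounter()
    {
        int count = 0;
        // The lambda closes over 'count'; the variable lives on
        // as long as the returned delegate does.
        return () => ++count;
    }

    static void Main()
    {
        Func<int> counter = MakeCounter();
        Console.WriteLine(counter());   // 1
        Console.WriteLine(counter());   // 2
    }
}
```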
Thursday, June 16, 2011
Clean Coding in Conshohocken
This has inspired me to read his new book, Clean Coder, and I really enjoyed it. It is very much anecdotal in nature, and he has great stories to tell. Having recently taken a course from Juval Lowy, who focused on his own successes and how to generalize them so that others could succeed, I found Bob's stories by contrast funny, self-deprecating, and even courageous in their level of self-revelation. I would never have the courage to tell his story about a meeting regarding project estimation while drunk. (He wasn't drinking on the job, of course, just bringing work talk into an off-hours social gathering.) A generous measure of failure is important, especially when the perspective comes from someone like Uncle Bob, who is essentially a preacher for his point of view; preachers without it can come across to me as strident or humorless. Readers who don't expect Uncle Bob to offer sermons on agile and TDD may be disappointed or even hostile. His views, particularly in this book, represent ideals. Striving for 100% code coverage is definitely an extreme view.
He has a really intriguing quiz early in the book on some of the more obscure topics in computer science. His questions are:
- Do you know what a Nassi-Schneiderman chart is?
- Do you know the difference between a Mealy and a Moore state machine?
- Could you write a quicksort without looking it up?
- Do you know what the term “Transform Analysis” means?
Answer: I am guessing that this refers to Fourier analysis. Despite learning about it in grad school, all I can say is that it is a way to solve problems whose answers can be found through analysis of differential equations.
- Could you perform a functional decomposition with Data Flow Diagrams?
Answer: Yuck, I did this all too often on my first professional project. I have found it not easily applicable to OO design; cross-cutting concerns like logging or communication end up as elements in far too many different diagrams.
- What does the term “Tramp Data” mean?
Answer: A term describing data which is passed to a function only to be passed on to another function.
- Have you heard the term “Connascence”?
Answer: Two software components are connascent if a change in one would require the other to be modified in order to maintain the overall correctness of the system. Connascence is a way to characterize and reason about certain types of complexity in software systems. I really like this term; I hope it sticks in my brain.
- What is a Parnas Table?
Answer: Tabular documentation of function values. It inspired the creation of FitNesse. It is just one of many contributions made by the brilliant David Parnas, whose collected papers can be purchased here at Amazon.
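Of the quiz questions above, the quicksort one is the only one answerable directly in code. A from-memory sketch in C#, using the simple last-element-pivot partition scheme:

```csharp
using System;

class Program
{
    // Quicksort written without looking it up: partition the range
    // around a pivot, then recurse on the two sides.
    static void QuickSort(int[] a, int lo, int hi)
    {
        if (lo >= hi) return;
        int pivot = a[hi];
        int i = lo;
        for (int j = lo; j < hi; j++)
        {
            if (a[j] < pivot)
            {
                int t = a[i]; a[i] = a[j]; a[j] = t;
                i++;
            }
        }
        // Move the pivot into its final position.
        int tmp = a[i]; a[i] = a[hi]; a[hi] = tmp;
        QuickSort(a, lo, i - 1);
        QuickSort(a, i + 1, hi);
    }

    static void Main()
    {
        var data = new[] { 5, 3, 8, 1, 9, 2 };
        QuickSort(data, 0, data.Length - 1);
        Console.WriteLine(string.Join(", ", data));   // 1, 2, 3, 5, 8, 9
    }
}
```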
The mistake isn't significant; it merely reflects my pedantic nature, age, and level of geekdom that I am bothered by his misquoting Yoda. It is “Do or do not. There is no try”, not “Do or do not. There is no trying”. To be fair, this seems like something an editor should have caught. For the best Yoda reference, see Udi Dahan's videocast, http://www.infoq.com/presentations/Making-Roles-Explicit-Udi-Dahan. Misquote aside, Bob's general point about language usage and estimation is one of my favorites in the book. As a developer, agreeing to try to meet a deadline is agreeing to the schedule. I have been in situations where other developers were caught out by agreeing to try something that deep down they knew was unattainable. It is much better to be upfront with management early and not yield to the “try” request. Bob emphasizes the value of analyzing hedge phrases like “try”, “do my best”, and “see what I can do” on the developer side, and similar words on the management side. He mentions several I need to be more alert to as signs of non-commitment, including saying “we need to …” or “let's …”.
I found the coverage of individual versus group estimation interesting. The ideas behind PERT estimation are familiar to me, and being individually responsible for estimation is something I have done often. What I haven't done, or learned much about, are techniques for group-based estimation. Bob talks about low-ceremony versions of the Wideband Delphi process, like flying fingers or planning poker. His discussion of a technique called Affinity Estimation was particularly intriguing. In Affinity Estimation, a set of tasks is written down on cards, and a group of people, without talking, must order the tasks. Tasks that keep moving in a volatile manner are set aside for discussion. Once the ordering of the tasks becomes stable, discussion can begin. I like the idea that the risk of groupthink is lessened by preventing vocal communication for a time.