# Zen Mode
I've been away from my blog for a while (as you may have already noticed). I can say that it's partially due to a painfully slo-o-o-w experience running WordPress on [BlueHost][1]. I eventually decided to ditch both, build a simple blog, and learn a thing or two while I'm at it. I'm well aware that most of the stuff built into this little project is total overkill, but it was a great learning experience, so here is the full stack:

+ ASP.NET MVC 3
+ [Dapper][2] - micro-ORM for data access
+ [AutoMapper][3] - DTO to view model mappings
+ [Unity][4] - dependency injection
+ [Memcached][5] - uhm, caching
+ [Twitter Bootstrap][6] - layout
+ [LESS][7] - CSS
+ [CoffeeScript][8] - JavaScript
+ [MarkdownSharp][9] - server-side markdown processing
+ [Marked.js][10] - client-side markdown processing (live preview)

Best of all, all of this goodness is hosted on [AppHarbor][11] for absolutely zilch! They're like Heroku for .NET, trying to make .NET deployments **really** easy. Anyway, more posts to come. Thanks for reading!

[1]: http://www.bluehost.com "Bluehost"
[2]: http://code.google.com/p/dapper-dot-net/ "Dapper - a simple object mapper for .Net."
[3]: http://www.automapper.org "A convention-based object-object mapper."
[4]: http://unity.codeplex.com/ "Microsoft patterns & practices"
[5]: https://github.com/enyim/EnyimMemcached/ "Enyim Memcached Client"
[6]: http://twitter.github.com/bootstrap/ "Bootstrap, from Twitter"
[7]: http://lesscss.org/ "The dynamic stylesheet language."
[8]: http://coffeescript.org/ "CoffeeScript is a little language that compiles into JavaScript."
[9]: http://code.google.com/p/markdownsharp/ "Open source C# implementation of Markdown processor, as featured on Stack Overflow."
[10]: https://github.com/chjj/marked/ "A markdown parser and compiler. Built for speed."
[11]: https://appharbor.com/ "Where .NET apps grow and prosper."
I’ve been doing quite a bit of work in JavaScript recently. It’s a major shift from the regular server-side grind. I learned a few patterns and became more aware of different architectural practices for writing front-end code. I am slowly starting to get over my love/hate relationship with all things client-side. I absolutely love the responsiveness and interactivity of a modern user interface, yet I despise writing JavaScript code. To me, it’s the mind-boggling dynamic nature of JavaScript that makes it so complicated. So, what I’d like to do is start a quick series about different JavaScript tips and tricks I’ve learned over time, which essentially make writing JavaScript a slightly better experience (at least in my opinion).

Like many others, I write all of my JavaScript with jQuery. I’ve already talked about [how to structure jQuery code][1]; today I’d like to discuss a cool plugin pattern I picked up from [Twitter Bootstrap][2] while using it on one of my projects. Here is the skeleton.

```javascript
(function () {

  /* Plugin class definition */

  var Plugin, privateMethod;

  Plugin = (function () {

    /* Plugin constructor */

    function Plugin(element, options) {
      this.settings = $.extend({}, $.fn.plugin.defaults, options);
      this.$element = $(element);
      /* Do some initialization */
    }

    /* Public method */

    Plugin.prototype.doSomething = function () {
      /* Method body here */
    };

    return Plugin;

  })();

  /* Private method */

  privateMethod = function () {
    /* Method body here */
  };

  /* Plugin definition */

  $.fn.plugin = function (options) {
    var instance;

    instance = this.data('plugin');

    if (!instance) {
      return this.each(function () {
        return $(this).data('plugin', new Plugin(this, options));
      });
    }

    if (options === true) return instance;
    if ($.type(options) === 'string') instance[options]();

    return this;
  };

  $.fn.plugin.defaults = {
    property1: 'value',
    property2: 'value'
  };

  /* Apply plugin automatically to any element with data-plugin */

  $(function () {
    return new Plugin($('[data-plugin]'));
  });

}).call(this);
```
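Before walking through the skeleton, the instance-storage and dispatch logic at its core can be sketched in plain JavaScript, with no jQuery or DOM required. This is a stand-alone illustration, not the plugin itself: the `Map` stands in for jQuery's per-element `data()` store, and the names mirror the skeleton's placeholders.

```javascript
// Plain-JavaScript sketch of the per-element instance storage and
// options dispatch used by the jQuery plugin skeleton above.
function Plugin(element, options) {
  this.element = element;
  this.settings = Object.assign({}, Plugin.defaults, options);
  this.calls = []; // records method invocations, for illustration only
}
Plugin.defaults = { property1: 'value', property2: 'value' };
Plugin.prototype.doSomething = function () { this.calls.push('doSomething'); };

var store = new Map(); // element -> Plugin instance (stand-in for $.data)

function plugin(elements, options) {
  var instance = store.get(elements[0]);
  if (!instance) {
    // first call: construct and store one instance per element
    elements.forEach(function (el) { store.set(el, new Plugin(el, options)); });
    return elements;
  }
  if (options === true) return instance;                // hand back the instance
  if (typeof options === 'string') instance[options](); // dispatch by method name
  return elements;
}

var els = ['#a', '#b'];
plugin(els, { property1: 'custom' }); // constructs and stores instances
plugin(els, 'doSomething');           // dispatches to the stored instance
var inst = plugin(els, true);         // retrieves the first element's instance
```

Storing one instance per element is what lets later calls reach the same state the first call created, which is the crux of the pattern.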
Calling the above plugin works like so: `$('selector').plugin();`

Starting at the top, we see that all of our code is inside a self-executing anonymous function. This is a pretty standard pattern in JavaScript used to isolate code into “blocks.” Next, we get to the interesting part, which sets this pattern apart from others. All of the plugin logic resides in the `Plugin` object. This allows you to use JavaScript’s prototypal inheritance to extend plugins when necessary.

Things get a little tricky in our jQuery plugin definition. What we’re attempting to do here is store the `Plugin` object inside the data attribute of each element resolved through the selector. So here is what’s happening inside:

1. If the selector in the calling code resolves to a single element, we’ll try to pull the instance of the `Plugin` object from that element’s data attribute and return it.
2. If nothing is found, we assume we’re working with a collection of elements and iterate through it, configuring (using the constructor) and storing an instance of the `Plugin` object in each element’s data attribute.
3. If the calling code passes `true` to the plugin, we’ll attempt to return the current instance of the `Plugin` object. This will only work with selectors resolving to a single element.
4. If the calling code passes a string to the plugin, we’ll assume it’s the name of a method of our plugin class and attempt to execute it.

Last but not least, we’ll try to automatically apply the plugin to any element marked with its specific data attribute. If you’re using CoffeeScript, here is the code that generates the above skeleton.
```coffeescript
### Plugin class definition ###

class Plugin

  ### Plugin constructor ###

  constructor: (element, options) ->
    this.settings = $.extend({}, $.fn.plugin.defaults, options)
    this.$element = $(element)
    ### Do some initialization ###

  ### Public method ###

  doSomething: () ->
    ### Method body here ###

### Private method ###

privateMethod = () ->
  ### Method body here ###

### Plugin definition ###

$.fn.plugin = (options) ->
  instance = this.data('plugin')

  if not instance
    return this.each ->
      $(this).data('plugin', new Plugin this, options)

  return instance if options is true
  instance[options]() if $.type(options) is 'string'

  return this

$.fn.plugin.defaults =
  property1: 'value'
  property2: 'value'

### Apply plugin automatically to any element with data-plugin ###

$ -> new Plugin($('[data-plugin]'))
```

If you have questions or improvements, share them in the comments. Thanks for reading!

[1]: http://www.sergeyakopov.com/2011/04/15/organizing-javascript-for-minification-maintenance-and-performance/ "Organizing JavaScript for minification, maintenance and performance"
[2]: http://twitter.github.com/bootstrap/ "Twitter Bootstrap"
My company has gone through two revision control systems. We’ve been through Visual SourceSafe, which, unfortunately, is still used for legacy projects, and we’ve also had a pretty kick-ass time with Subversion. I personally absolutely love Subversion, but you can’t disregard the awesomeness of Team Foundation Server (TFS). Right out of the box you get a source repository, document storage, reporting, bug tracking and development methodology tools. But I digress. The goal of this post is to outline the requirements that have to be met before installation and the issues you may run across while installing TFS and its prerequisites.

## Organizing your tiers

Before proceeding with installation you need to figure out how you want to organize your tiers. My installation utilizes two servers. The data-tier server is only running SQL Server 2008 R2. The application-tier server houses everything else, such as the TFS services, Analysis Services, SQL Server Reporting Services (SSRS) and SharePoint Services (WSS). This setup really depends on the size of your team. If you have a large, distributed team that is growing, then perhaps you want to consider a different setup that would allow for greater scalability as your company grows. If you have a really small team, then you could combine both tiers on the same server.

## Preparing user accounts

It is VERY important not to screw this up. The TFS installation requires at least two domain accounts (labeled in bold below); however, I will be discussing all of the standard accounts used.

**TFSsetup** account is used during installation, repair and servicing (applying patches and hotfixes). This account has to have local admin rights on the TFS server and be a sysadmin in SQL Server while performing these tasks. There is another very crucial step. The TFS installation uses the Windows Management Instrumentation (WMI) interface to query remote servers in order to validate that a certain service or component is installed and running.
This translates to one thing – the **TFSsetup** user must have administrator rights on all servers involved in the installation, or you will likely see a bunch of permission errors and warnings while it’s attempting to use WMI against a remote server. Having said that, if you plan to run your SQL Server on another box like I do, make sure your setup account has admin rights on it. The same applies to SSRS and Analysis Services.

**TFSservice** account, as the name suggests, is used to run the TFS services. This account is responsible for running several of the TFS back-end jobs and is the account used to access the SQL databases. It will need the “dbcreator” and “securityadmin” roles in SQL Server. In addition, this account must be added to the “Log on as a service” security policy on the TFS server.

**TFSreports** account is used as the data reader account for SSRS. This account must be added to the “Allow log on locally” security policy or you will have problems executing reports. Optionally, you could use the TFSservice account for this.

**WSSservice** account is used to run the SharePoint services. Of course, this is only required if you plan to integrate with SharePoint. I simply use the TFSservice account for SharePoint.

## Installation

There isn’t a lot to note here. The installation process is very straightforward, but the most crucial part is to carefully read the instructions on each step and specify the right user accounts. I found it easier to install all prerequisites prior to running the TFS setup. As a rule of thumb, make sure your server is up to date with the latest and greatest prior to running the TFS setup. Also, verify that [TFS Service Pack 1][1] is installed on the server. Finally, don’t forget to run the entire install as the TFSsetup user.

### SSRS/Analysis Services

If you’re installing Analysis Services and SSRS, make sure to use **TFSservice** as your “Service Account.” The installation defaults to Network Service or Local Service if you don’t specify otherwise.
These will not work! At least, that was my expectation – in practice, using a domain account as the Service Account in SSRS actually caused issues with TFS not being able to set up reports for Team Projects. I also had authentication issues (only in IE) while trying to view reports. Changing back to the built-in “Network Service” account resolved both problems. Make sure to restart the service as soon as you change your account settings.

### SharePoint

The TFS setup gives you the option to install WSS for you. However, if you’re [installing SharePoint][2] manually as I did, then remember to select the “Web Front-end” installation type in the wizard and use _17012_ for the port number when asked. I also had a strange problem running the SharePoint setup where I was getting a _“This package failed to run”_ exception. After pulling out the three hairs I have left on my head, I learned that the cure for this problem is to extract the contents of the setup package by running `C:\Path\To\SharePoint.exe /extract:C:\Path\To\Some\Extract\Folder` and then running Setup.exe in the destination folder you specified during the extract process.

That’s it! Happy installing!

[1]: http://www.microsoft.com/download/en/details.aspx?id=20506 "Download: Microsoft® Visual Studio Team Foundation Server® 2010, Service Pack 1"
[2]: http://www.microsoft.com/download/en/details.aspx?id=7006 "Download: Windows SharePoint Services 3.0 with Service Pack 2"
Just a few days ago, a colleague and I were attempting to do just this on a production site which collects sensitive information from users. This site used to require SSL globally. We wanted to limit it to only certain parts of the site, while understanding that the business/UI logic shouldn’t care about the protocol it runs on. After some research, we landed on two solutions.

1. An attribute-based approach, where marking an ASP.NET page with a custom attribute automatically redirects to HTTPS when the page is accessed over HTTP, and vice versa when going back from secure to unsecure areas.
2. A configuration-based approach, where secure sections of the site are defined using virtual paths and regular expressions.

The good thing is that both solutions are already out there. The [first approach is detailed on CodeProject][1]. It involves a custom attribute named `RequireSSL` and an `HttpModule` which handles redirects depending on the presence or absence of the attribute on page handlers. Simply slap `RequireSSL` on the page class you want to protect and you’re done!

The [second approach][2] does the same thing by configuring protected paths on the site. This method does not require code changes, as all configuration sits in the web configuration file. This is great for larger applications because you can protect an entire directory with a single configuration entry, or, if you want to go more granular, you can specify a single resource or a collection of resources matching a regular expression.

We went with the first approach, mainly because it was much easier to set up without any third-party tooling and configuration, and we thought it would be easier to reuse and integrate into our internal framework.
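The path-matching idea behind the configuration-based approach fits in a few lines. This is a rough sketch in JavaScript for illustration only; the patterns and function names are mine, not SecuritySwitch's actual API.

```javascript
// Secure areas declared as path patterns, mimicking the configuration-based
// approach: each request is checked and redirected to the matching scheme.
var securePaths = [/^\/checkout(\/|$)/i, /^\/account(\/|$)/i];

// Which scheme should this path be served over?
function targetScheme(path) {
  var secure = securePaths.some(function (re) { return re.test(path); });
  return secure ? 'https' : 'http';
}

// Returns the redirect URL, or null if the request is already on the
// right scheme (example.com is a placeholder host).
function redirectFor(scheme, path) {
  var wanted = targetScheme(path);
  return wanted === scheme ? null : wanted + '://example.com' + path;
}

redirectFor('http', '/checkout/payment'); // secure area over HTTP: redirect up
redirectFor('https', '/about');           // plain area over HTTPS: redirect down
redirectFor('https', '/account');         // already on the right scheme: null
```

The redirect back from HTTPS to HTTP is the part that is easy to forget, and it matters: leaving users on HTTPS everywhere is exactly the overhead we were trying to eliminate.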
[1]: http://www.codeproject.com/KB/web-security/http_https.aspx "Switching Between HTTP and HTTPS Like A Bigshot Hotshot"
[2]: http://code.google.com/p/securityswitch/wiki/GettingStarted ".NET libraries for automatically switching between HTTP and HTTPS protocols."
I recently looked into HTML minification in ASP.NET. The first thing I thought of was to use an `HttpModule` to somehow remove white space. Not so great, as it would execute at run-time and could impact performance. Then I turned to Google for answers and found out about a really neat feature introduced in ASP.NET 2.0, which makes this kind of thing incredibly easy and seamless. This is one of those .NET gems you’ve likely never heard of – the [PageParserFilter][1] class.

As it turns out, `PageParserFilter` allows you to hook into page parsing at compile time. This class provides the control tree of your .aspx page (including server-side and client-side markup) and allows you to alter it using an instance of [ControlBuilder][2]. Not sure about you, but I already jizzed my pants. Well, anyway, this would be the ultimate place to do the HTML minification magic I wanted. In fact, it’s [already been done here][3] by Omari Omarov, and it works beautifully (download his sample application to see how it’s used).

I spent some time analyzing Omari’s code and had another useful idea, which I decided to turn into simple proof-of-concept code for this post. I decided to use the same method to set cache breakers on my JavaScript and stylesheet includes. If you’re not familiar with cache breakers, all I’m talking about is the version number that you append to the end of your CSS/JS includes to break cache dependency in browsers (i.e. master.js?v=12345). So I mocked up a quick prototype based on Omari’s code just to show how easily this can be done.
```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.UI;
using System.Collections;
using System.Reflection;
using System.Web.UI.HtmlControls;

namespace Page.Parsing.Voodoo
{
    public class CacheBreakerPageParserFilter : PageParserFilter
    {
        const BindingFlags InstPubNonpub =
            BindingFlags.Public | BindingFlags.NonPublic | BindingFlags.Instance;

        public override bool AllowCode
        {
            get { return true; }
        }

        public override int NumberOfControlsAllowed
        {
            get { return -1; }
        }

        public override int NumberOfDirectDependenciesAllowed
        {
            get { return -1; }
        }

        public override int TotalNumberOfDependenciesAllowed
        {
            get { return -1; }
        }

        public override bool AllowBaseType(Type baseType)
        {
            return true;
        }

        public override bool AllowControl(Type controlType, ControlBuilder builder)
        {
            return true;
        }

        public override bool AllowServerSideInclude(string includeVirtualPath)
        {
            return true;
        }

        public override bool AllowVirtualReference(string referenceVirtualPath, VirtualReferenceType referenceType)
        {
            return true;
        }

        public override CompilationMode GetCompilationMode(CompilationMode current)
        {
            return base.GetCompilationMode(current);
        }

        public override Type GetNoCompileUserControlType()
        {
            return base.GetNoCompileUserControlType();
        }

        public override bool ProcessCodeConstruct(CodeConstructType codeType, string code)
        {
            return base.ProcessCodeConstruct(codeType, code);
        }

        public override bool ProcessDataBindingAttribute(string controlId, string name, string value)
        {
            return base.ProcessDataBindingAttribute(controlId, name, value);
        }

        public override bool ProcessEventHookup(string controlId, string eventName, string handlerName)
        {
            return base.ProcessEventHookup(controlId, eventName, handlerName);
        }

        protected override void Initialize()
        {
            base.Initialize();
        }

        public override void ParseComplete(ControlBuilder rootBuilder)
        {
            SetCacheBreakerOnNestedBuilder(rootBuilder);
            base.ParseComplete(rootBuilder);
        }

        private static void SetCacheBreakerOnNestedBuilder(ControlBuilder controlBuilder)
        {
            ArrayList nestedBuilders = GetSubBuilders(controlBuilder);

            // Literal (non-nested) markup is parsed as plain strings.
            for (int i = 0; i < nestedBuilders.Count; i++)
            {
                string literal = nestedBuilders[i] as string;
                if (string.IsNullOrEmpty(literal)) continue;
                nestedBuilders[i] = AppendCacheBreaker(literal);
            }

            if (controlBuilder.ControlType == typeof(HtmlLink))
            {
                foreach (SimplePropertyEntry entry in GetSimplePropertyEntries(controlBuilder))
                {
                    entry.Value = AppendCacheBreaker(entry.PersistedValue);
                }
            }
            else
            {
                foreach (object nestedBuilder in nestedBuilders)
                {
                    if (nestedBuilder is ControlBuilder)
                    {
                        SetCacheBreakerOnNestedBuilder((ControlBuilder)nestedBuilder);
                    }
                }

                foreach (TemplatePropertyEntry entry in GetTemplatePropertyEntries(controlBuilder))
                {
                    SetCacheBreakerOnNestedBuilder(entry.Builder);
                }

                foreach (ComplexPropertyEntry entry in GetComplexPropertyEntries(controlBuilder))
                {
                    SetCacheBreakerOnNestedBuilder(entry.Builder);
                }
            }

            ControlBuilder defaultPropertyBuilder = GetDefaultPropertyBuilder(controlBuilder);
            if (defaultPropertyBuilder != null)
            {
                SetCacheBreakerOnNestedBuilder(defaultPropertyBuilder);
            }
        }

        private static string AppendCacheBreaker(string literal)
        {
            if (literal.Contains(".css"))
            {
                literal = literal.Replace(".css", ".css?v=1234567890");
            }
            if (literal.Contains(".js"))
            {
                literal = literal.Replace(".js", ".js?v=1234567890");
            }
            return literal;
        }

        private static ArrayList GetSubBuilders(ControlBuilder controlBuilder)
        {
            if (controlBuilder == null) throw new ArgumentNullException("controlBuilder");

            return (ArrayList)controlBuilder
                .GetType()
                .GetProperty("SubBuilders", InstPubNonpub)
                .GetValue(controlBuilder, null);
        }

        private static ControlBuilder GetDefaultPropertyBuilder(ControlBuilder controlBuilder)
        {
            if (controlBuilder == null) throw new ArgumentNullException("controlBuilder");

            PropertyInfo pi = null;
            Type type = controlBuilder.GetType();

            while (type != null && null == (pi = type.GetProperty("DefaultPropertyBuilder", InstPubNonpub)))
            {
                type = type.BaseType;
            }

            return (ControlBuilder)pi.GetValue(controlBuilder, null);
        }

        private static ArrayList GetTemplatePropertyEntries(ControlBuilder controlBuilder)
        {
            if (controlBuilder == null) throw new ArgumentNullException("controlBuilder");

            ICollection tpes = (ICollection)controlBuilder
                .GetType()
                .GetProperty("TemplatePropertyEntries", InstPubNonpub)
                .GetValue(controlBuilder, null);

            if (tpes == null || tpes.Count == 0)
            {
                return new ArrayList(0);
            }
            return (ArrayList)tpes;
        }

        private static ArrayList GetComplexPropertyEntries(ControlBuilder controlBuilder)
        {
            if (controlBuilder == null) throw new ArgumentNullException("controlBuilder");

            ICollection cpes = (ICollection)controlBuilder
                .GetType()
                .GetProperty("ComplexPropertyEntries", InstPubNonpub)
                .GetValue(controlBuilder, null);

            if (cpes == null || cpes.Count == 0)
            {
                return new ArrayList(0);
            }
            return (ArrayList)cpes;
        }

        private static ArrayList GetSimplePropertyEntries(ControlBuilder controlBuilder)
        {
            if (controlBuilder == null) throw new ArgumentNullException("controlBuilder");

            ICollection spes = (ICollection)controlBuilder
                .GetType()
                .GetProperty("SimplePropertyEntries", InstPubNonpub)
                .GetValue(controlBuilder, null);

            if (spes == null || spes.Count == 0)
            {
                return new ArrayList(0);
            }
            return (ArrayList)spes;
        }
    }
}
```

To use this parser filter, reference your parser filter type in the configuration settings.

First of all, a lot of the code you see here was extracted from Omari's framework. This is just to show the entire parser filter without any dependencies on his code, so a lot of the voodoo handled by Omari's framework is thrown into this single class. I would actually suggest using his framework for any work with parser filters, for the two reasons outlined below.

1. The `ControlBuilder` class hides a few properties which are of interest to us, so we have to use reflection magic. Omari's parser filter framework provides a useful set of extension methods that help with that.
2.
By design, you can only register a single `PageParserFilter`. Once again, Omari gives us a configuration section to register and execute more than one if necessary.

Aside from the overridden properties, which you can read about on MSDN, let me get straight to the point. I do all of the parsing in the `ParseComplete` override. This method is called when the page parser has finished parsing all of the client- and server-side markup. At this point, the entire page hierarchy of server-side controls and HTML elements is encapsulated in the instance of `ControlBuilder` passed to this method. This object is nested, as it contains `ControlBuilder` instances for child elements within the page hierarchy. (Note that some elements, such as the `DOCTYPE`, do not require a `ControlBuilder` and are represented as simple strings. In fact, it appears that all non-nested elements are parsed as strings.) We need to recursively loop through this object to step through the entire page hierarchy looking for external file references. I do this in the `SetCacheBreakerOnNestedBuilder` method. If I find an external JavaScript or stylesheet reference, I tack the hard-coded _?v=1234567890_ onto the end.

As this is only a proof of concept, there is a lot of room for improvement. The version number could be retrieved from the executing assembly or the configuration file. Also, some script references may not need to be versioned (i.e. references from Google’s CDN), so we could add an attribute to certain script elements to exclude them from versioning and have our parser filter read and delete it afterward. There are probably a lot more useful things you could do with this class. [Phil Haack][4] also talked about a way to throw a compile-time exception if you want to restrict certain elements (i.e. server script blocks) from your MVC views.
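For reference, a parser filter is registered through the `pageParserFilterType` attribute of the `<pages>` element in web.config. A minimal sketch, using the namespace from the prototype above; the assembly name is a placeholder for your own:

```xml
<system.web>
  <pages pageParserFilterType="Page.Parsing.Voodoo.CacheBreakerPageParserFilter, MyAssembly" />
</system.web>
```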
[1]: http://msdn.microsoft.com/en-us/library/system.web.ui.pageparserfilter.aspx
[2]: http://msdn.microsoft.com/en-us/library/system.web.ui.controlbuilder.aspx
[3]: http://omari-o.blogspot.com/2009/09/aspnet-white-space-cleaning-with-no.html
[4]: http://haacked.com/archive/2009/05/05/page-view-lockdown.aspx