Development

Manage jQuery in IE8 with bower

Here’s a quick way of using up-to-date jQuery whilst still supporting IE8.

In ASP.NET MVC Razor

    <!--[if IE 8]>
    @Scripts.Render("~/bower_components/jquery-legacy/dist/jquery.min.js")
    <![endif]-->
    <!--[if (gt IE 8)|!(IE)]><!-->
    @Scripts.Render("~/bower_components/jquery-modern/dist/jquery.min.js")
    <!--<![endif]-->

Note the downlevel-revealed form of the second conditional comment (the extra <!--> and <!--<![endif]-->); without it, non-IE browsers would treat the whole block as an ordinary comment and never load jQuery at all.

In your bower.json

    "dependencies": {
      "bootstrap": "~3.3.5",
      ...,
      "jquery-legacy": "jquery#^1",
      "jquery-modern": "jquery#^2"
    }

Why should I use Bower?

What is Bower?

Bower is a package manager for front-end frameworks.

Why should I use Bower?

It’s arguably simpler to use Bower as your package manager for front-end development than to use NuGet.

  • It can create a bower.json file that lists all of your dependencies.
  • It’s easily driven from the command line.
  • It separates out the concerns of the front-end from those of the back-end.

A great benefit is that it’s a single point of access to thousands of front-end packages that just aren’t available as NuGet packages. Previously you would have had to download and install the JavaScript and the CSS by hand; the process varied from project to project and there was no obvious upgrade path to the latest versions.

Starting with ASP.NET 5.0 there will be no more NuGet packages for front-end frameworks like Twitter Bootstrap. Microsoft currently creates these packages using Bower but will now expect you to tool up with Bower yourself. Might as well get started early.

How do I use Bower?

This gets a bit gnarly, mainly because we’re behind a corporate proxy that expects authentication, and tools like npm and Bower aren’t clued up about that. You need to set up a local proxy server to handle the NTLM authentication; I’ve been successful with CNTLM.

  • set up CNTLM
  • install node (.msi)
  • set up .npmrc
  • $ npm install -g npm
  • ensure %appdata%\npm is before %ProgramFiles%\nodejs in the PATH
    (npm wiki: upgrading-on-windows)
  • $ npm install -g bower
  • set up .bowerrc
  • $ bower install bootstrap

You can now reference your bower_components directly from your includes or you can configure a Gulp/Grunt build.
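
The .npmrc and .bowerrc steps above just need to point the tools at the local CNTLM proxy. Assuming CNTLM is listening on its default port of 3128, .npmrc looks something like:

    proxy=http://127.0.0.1:3128
    https-proxy=http://127.0.0.1:3128

and .bowerrc:

    {
      "proxy": "http://127.0.0.1:3128",
      "https-proxy": "http://127.0.0.1:3128"
    }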

Where do I find out more?

Bower
Introducing Gulp, Grunt, Bower, and npm support for Visual Studio

Subversion repository mirror

Creating a mirror of your Subversion repository means you have an up-to-date copy of your repository in a safe place. This guide shows how to set up a mirror and configure it for synchronisation. The instructions are for Windows.

Setting up the mirror uses svnsync which can be very slow over the network. It’s better to take a dump of the repository and load that into the mirror before starting the sync.

Take a backup

On the central repository:
$ cd c:\temp
$ svnadmin dump c:\subversion\my_repository --incremental -q > my-repository.dump
Create a 7zip archive of the dump file and delete the dump.
Copy the archived dump file to the repository mirror machine.

Create the mirror

On the mirror’s machine:
$ mkdir c:\repos && cd c:\repos
$ svnadmin create my-mirror

Load the dump file.
$ svnadmin load my-mirror -q < c:\temp\my-repository.dump

Serve up.
$ svnserve -d -r c:\repos\my-mirror
$ svn info svn://localhost/

Allow write access

conf/svnserve.conf
anon-access = none
auth-access = write
password-db = passwd.txt
authz-db = authz.txt

conf/passwd.txt
[users]
repobot = Robot1

conf/authz.txt
[/]
repobot = rw

Note: It can be easier to set up with anon-access = write and once it’s all working then upgrade the security.

Set up a pre-revprop-change hook

See Step 3 of http://www.cardinalpath.com/how-to-use-svnsync-to-create-a-mirror-backup-of-your-subversion-repository/
$ cp C:\repos\my-mirror\hooks\pre-revprop-change.tmpl pre-revprop-change.bat
@ECHO OFF
set repository=%1
set revision=%2
set userName=%3
set propertyName=%4
set action=%5

:: Allow editing of revision properties for the sync user only.
if "%userName%" == "repobot" goto :eof

:ERROR_USER
echo Only admin user may change revision properties. >&2
goto ERROR_EXIT

:ERROR_EXIT
exit 1

Set up properties to allow svnsync with the pre-populated mirror

$ svn info https://repo-box:8443/svn/my_repository

Use the UUID of the remote:
$ svn propset --revprop -r0 svn:sync-from-uuid 5aab9995-1597-de4e-ac02-ede68baf940a svn://localhost/

Use the last merged revision number:
$ svn propset --revprop -r0 svn:sync-last-merged-rev 14175 svn://localhost/

Set to sync from url:
$ svn propset --revprop -r0 svn:sync-from-url https://repo-box:8443/svn/my_repository svn://localhost/

Initial Sync

$ svnsync synchronize svn://localhost/

Auto-sync

repo-box:
hooks/post-commit.bat
svnsync --non-interactive --steal-lock --sync-username repobot --sync-password Robot1 --source-username repobot --source-password Robot1 sync svn://mirror-box/ svn://repo-box/

Serve the central repository with svnserve running as a Windows service:
$ sc create SvnServe binpath= "\"%programfiles%\CollabNet\Subversion Client\svnserve.exe\" --service -r c:\subversion\my_repository" depend= Tcpip start= auto
$ net start svnserve

Create Windows Service for svnserve on the mirror’s machine

sc delete SvnServe
sc create SvnServe binPath= "%programfiles%\TortoiseSVN\bin\svnserve.exe --service -r c:\repos\my-mirror" depend= Tcpip start= auto
net start SvnServe

Bulk Insert and Entity Framework

Entity Framework (EF), as of v6, still has no method for dealing with bulk inserts. If you have, for instance, 750,000 records that need inserting then you will end up sending 750,000 insert statements to the database. In this instance you may also find yourself creating an enormous object graph, so your application may not even get to the point of performing the required inserts without running out of memory.

Here’s a simple workaround. Take a copy of the part of your object model that’s going to cause the problem and then null it out. Perform the rest of the database insert using EF 6 and grab whatever identities you may need for the remaining objects’ foreign keys. Then perform a Bulk SQL Insert of the remaining data.

Here’s a cut-down entity relationship diagram for an example object hierarchy, in this case an investment fund that is a wrap product for further funds (a fund of funds). In the wrap there are multiple years in the forecast and in each year the percentage of the total sum invested varies across the funds. A single configuration contains multiples of these wraps. What’s important here is the hierarchy can grow to have a great many percentages in the final table and thus a great many inserts.

Entity Relationship diagram

The solution I’ve used relies on getting the DbContext from EF and using the underlying database connection to create Commands (plain old SQL) for the intermediary tables (Fund and Year). It then creates a DataTable for the final table, Percent, and uses SqlBulkCopy() with the context’s db connection and the underlying transaction (nice). The whole thing is wrapped in a try catch block that is within the context’s db transaction so any exceptions can cause a rollback.

You can follow along with this quite simple workaround in this demo code. Note the use of AutoDetectChangesEnabled = false and ValidateOnSaveEnabled = false, both of which improve performance without causing any other issues here.
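
The shape of the workaround is roughly as follows. This is an illustrative sketch rather than the demo code itself: the context, entity and column names are invented, and the demo drives the intermediary tables with hand-written SQL commands on the same connection, whereas here only the bulk-copy half is shown.

using System.Data;
using System.Data.SqlClient;
using System.Linq;

public static class ForecastWriter
{
    // Sketch only: ForecastContext, Wrap, Year and Percent are stand-ins for your own model.
    public static void Save(ForecastContext context, Wrap wrap)
    {
        context.Configuration.AutoDetectChangesEnabled = false;
        context.Configuration.ValidateOnSaveEnabled = false;

        using (var efTransaction = context.Database.BeginTransaction())
        {
            try
            {
                var connection = (SqlConnection)context.Database.Connection;
                var transaction = (SqlTransaction)efTransaction.UnderlyingTransaction;

                // Detach the huge Percent collections before EF sees them, then let EF
                // insert the top of the graph so the Year identities come back populated.
                var percentsByYear = wrap.Years.ToDictionary(y => y, y => y.Percents.ToList());
                foreach (var year in wrap.Years)
                    year.Percents = null;

                context.Wraps.Add(wrap);
                context.SaveChanges();

                // Build a DataTable for the Percent rows and bulk insert them on the
                // same connection and transaction as the EF work.
                var table = new DataTable();
                table.Columns.Add("YearId", typeof(int));
                table.Columns.Add("FundId", typeof(int));
                table.Columns.Add("Percentage", typeof(decimal));

                foreach (var pair in percentsByYear)
                    foreach (var percent in pair.Value)
                        table.Rows.Add(pair.Key.Id, percent.FundId, percent.Percentage);

                using (var bulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.Default, transaction))
                {
                    bulkCopy.DestinationTableName = "dbo.[Percent]";
                    bulkCopy.WriteToServer(table);
                }

                efTransaction.Commit();
            }
            catch
            {
                efTransaction.Rollback();
                throw;
            }
        }
    }
}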

Long running processes and RESTful APIs.

I have to create a RESTful API that can accept a request for a resource that may take some time to find as it’s the result of a complex deterministic model. The ASP.NET Web API is hosted on an Azure Web Role and exposes a RESTful interface to the clients. The clients are not web browsers, in this case, but other applications.

The initial GET or “fetch” of a particular resource, with a very long querystring containing all of the data the model run requires, is answered with a 202 Accepted. A 304 Not Modified is then returned on each subsequent request for the same resource until the new resource finally arrives in our server’s cache, placed there by a background process that monitors a queue which is in turn fed by the model engine. At that point we return the new resource with a 200 OK. As the result already exists, in a way, and we are just taking some time to fetch it for the client, the cache is not being updated with a result so much as caching the resource from the application.

In REST the resource has an id; in this case the id is the URI, or the ETag (the Entity Tag being a hash of the URI). This is not a “process” like a POST or “insert” of a resource: nothing in the data is being changed, it’s just a very slow request.

Request Response Ping Pong

The initial request immediately returns a 202 Accepted. This is better than holding the connection open for up to the 4 minutes allowed by the Azure Load Balancer as that would be expensive and we would run the risk of overloading our application.

The 202 response carries with it an ETag (Entity Tag), which is the token the client can use to make another request to the application. An ETag represents a resource and is intended to be used for caching. We are caching our resource and its current state is empty.

The client will then present the ETag in the If-None-Match header value. This is the specified way to check for any changes to a resource. If the state of the resource has not changed then the application will return a 304 Not Modified. If the resource has changed, which in practice means the model has completed its run and the result has been dumped in the cache, then the application will return the current state of the resource, along with the ETag (should the client wish to request the resource again) and a 200 OK to indicate the end of the request.
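
A minimal sketch of what that GET looks like in ASP.NET Web API follows; the IResultCache and IModelRunQueue abstractions are assumptions made for illustration rather than the real implementation.

using System;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Security.Cryptography;
using System.Text;
using System.Web.Http;

// Assumed abstractions over the server cache and the model engine's queue.
public interface IResultCache { object Get(string etag); }
public interface IModelRunQueue { void Enqueue(string etag, string query); }

public class ForecastController : ApiController
{
    private readonly IResultCache cache;
    private readonly IModelRunQueue queue;

    public ForecastController(IResultCache cache, IModelRunQueue queue)
    {
        this.cache = cache;
        this.queue = queue;
    }

    public HttpResponseMessage Get()
    {
        // The resource id is the ETag: a hash of the request URI, querystring and all.
        var etag = new EntityTagHeaderValue("\"" + Hash(Request.RequestUri.AbsoluteUri) + "\"");

        var result = cache.Get(etag.Tag);
        if (result != null)
        {
            // The model run has finished and the background process has cached the
            // result: return it with a 200 OK and the ETag.
            var ok = Request.CreateResponse(HttpStatusCode.OK, result);
            ok.Headers.ETag = etag;
            return ok;
        }

        if (Request.Headers.IfNoneMatch.Any(t => t.Tag == etag.Tag))
        {
            // The client is polling with the ETag and nothing has changed yet.
            return Request.CreateResponse(HttpStatusCode.NotModified);
        }

        // First request for this resource: queue the model run and hand back
        // a 202 Accepted carrying the ETag to poll with.
        queue.Enqueue(etag.Tag, Request.RequestUri.Query);
        var accepted = Request.CreateResponse(HttpStatusCode.Accepted);
        accepted.Headers.ETag = etag;
        return accepted;
    }

    private static string Hash(string uri)
    {
        using (var sha = SHA256.Create())
        {
            var bytes = sha.ComputeHash(Encoding.UTF8.GetBytes(uri));
            return BitConverter.ToString(bytes).Replace("-", string.Empty);
        }
    }
}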

Note that the above is only example code that has been stripped of some conditional checks and guard clauses for readability.

Update npm to v.2.0.x on Windows 7

The node package manager, npm, is now at v.2.0.2 but the update path isn’t obvious on Windows 7.

  1. remove Node.js from Programs and Features
  2. install the latest Node.js (use the .msi for x86 or x64)
  3. npm install npm -g
  4. delete ‘npm’, ‘npm.cmd’ files from “C:\Program Files\nodejs”

You have to delete the version of npm in the nodejs folder or it will take precedence over the version installed in your %APPDATA%.

Now you can check you have the latest version:

npm --version

If you need to update later the standard command should work:

npm update npm -g

YMMV

CNTLM

CNTLM is an authentication proxy that can solve problems with applications trying to get through a Windows proxy that requires authentication. I need this to clone from GitHub repos when behind a workplace proxy server. Getting it installed, however, may involve a little fancy footwork.

  1. Rename the .exe to something else as some policies won’t allow it to be installed. How about mostly_harmless.exe?
  2. Use the command line options -H -u -d to get the hashes for your config file, mostly_harmless.ini (see the example after this list).
  3. Launch the application: mostly_harmless -c mostly_harmless.ini
  4. Edit your global .gitconfig: [http] proxy=http://127.0.0.1:3128
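
The hash generation in step 2 looks something like this (the username, domain and hashes are placeholders):

    $ mostly_harmless -H -u mylogin -d MYDOMAIN
    Password:
    PassLM          <lm-hash>
    PassNT          <nt-hash>
    PassNTLMv2      <ntlmv2-hash>

and the resulting mostly_harmless.ini carries those values along with the upstream proxy and the local listening port:

    Username        mylogin
    Domain          MYDOMAIN
    PassNTLMv2      <ntlmv2-hash>
    Proxy           company-proxy:8080
    Listen          3128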

Hurrah!

Ninject Configurer

What’s a nice way to set up configuration for add-in libraries in your project? They can’t have their own app.config; that has to be centralised in your web.config. So wrap up the config you’re going to need into a nice class and then pass that in when you need it.

If you’re using Ninject as your IoC you’ll want it to manage the dependency for you. Hey, while it’s at it, couldn’t it just fetch out those config values, too?

That’s what Ninject.Configurer does. It’s available as a NuGet package, but that targets .NET 3.5 so you may need to compile your own. I add a ‘Configuration’ folder to the class library and then just need three classes.
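
A rough sketch of the idea using plain Ninject is below; the class, setting and key names are made up for illustration and this isn’t Ninject.Configurer’s actual API.

using System.Configuration;
using Ninject.Modules;

// Illustrative settings POCO for the add-in library.
public class AddInSettings
{
    public string ServiceUrl { get; set; }
    public int TimeoutSeconds { get; set; }
}

// A Ninject module that pulls the values out of the host's web.config
// and binds the settings object for the add-in to consume.
public class AddInModule : NinjectModule
{
    public override void Load()
    {
        Bind<AddInSettings>().ToMethod(ctx => new AddInSettings
        {
            ServiceUrl = ConfigurationManager.AppSettings["AddIn.ServiceUrl"],
            TimeoutSeconds = int.Parse(ConfigurationManager.AppSettings["AddIn.TimeoutSeconds"] ?? "30")
        }).InSingletonScope();
    }
}

The add-in’s classes then just take an AddInSettings in their constructors and Ninject supplies it.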

Microsoft.Bcl generally considered harmful

I’ve found the best way to avoid the problems with the Microsoft BCL Portability Pack, which I had in an ASP.NET MVC 4 project, was to get rid of it.

Remove all System.Net.Http and System.Net.Http.* references, then use NuGet to remove the BCL Portability Pack. Use NuGet to install an earlier version of Microsoft.Net.Http that doesn’t rely on BCL:

Install-Package Microsoft.Net.Http -Version 2.0.20710.0

Then update the web.config:

<dependentAssembly>
  <assemblyIdentity name="System.Net.Http" publicKeyToken="b03f5f7f11d50a3a" culture="neutral"/>
  <bindingRedirect oldVersion="0.0.0.0-1.0.0.0" newVersion="2.0.0.0"/>
</dependentAssembly>

Now any developer can open the solution *without* some silly M$ workaround.

NuGet and chocolatey behind a proxy

NuGet historically didn’t handle company proxies, a common enough issue, which has led to a lot of now out-of-date guidance on the ’net about dealing with a proxy that isn’t playing nicely with your default credentials. Here are two tips: the first will let you get chocolatey installed and the second will let you configure NuGet so that chocolatey can do its goodness.

install chocolatey with windows auth proxy

The above gist will let you install chocolatey from a PowerShell command line. I used the normal download URL like this:

Set-ExecutionPolicy -ExecutionPolicy Unrestricted;$a=new-object net.webclient;$a.proxy.credentials=[system.net.credentialcache]::defaultnetworkcredentials;$a.downloadstring('https://chocolatey.org/install.ps1')|iex

NuGet configuration settings

From the linked configuration settings above I added:

.\nuget config -Set http_proxy=company-proxy:8080
.\nuget config -Set http_proxy.user=mylogin
.\nuget config -Set http_proxy.password=secret

Note that the password is encrypted in your NuGet.config so you can’t edit it directly.

Now I can use chocolatey without seeing:

Please provide proxy credentials:
UserName: Cannot prompt for input in non-interactive mode.

I believe that PowerShell and NuGet are trying to present the default web credentials rather than the default network credentials, and in our case the proxy we lurk behind isn’t satisfied with that, hence the need to be explicit. It’s possible that NuGet was failing to present any credentials at all; I could investigate with Fiddler, but these fixes have solved it for me. YMMV.