Node.js npm module install on Windows 8

Being a complete novice at Node.js, I was hoping for all the npm modules to install correctly. Well, they did... almost. When it came to D3 and Ursa, the npm installation failed with various errors ranging from Python to C++ and OpenSSL.

To all the people beginning Node development on Windows 8/7, try these prerequisites.

This covers most of them, but based on my experience I added a few more:

  • Visual Studio 2013 Express for Windows Desktop with C++ (on Windows 8, Visual Studio C++ 2012 Express for Windows Desktop works well too)
  • node-gyp (npm install -g node-gyp --msvs_version=2013)
    • Python 2.7 (not 3.3)
    • Set the PYTHON environment variable to C:\Python27
  • OpenSSL (normal, not light) in the same bitness as your Node.js installation
    • The build script looks for OpenSSL in the default install directory (C:\OpenSSL-Win32 or C:\OpenSSL-Win64)
    • If you get “Error: The specified module could not be found.”, copy libeay32.dll from the OpenSSL bin directory to this module’s bin directory, or to Windows\System32
    • I in fact installed both the 64-bit and the 32-bit versions
  • Copy the include folder from C:\Python27 into C:\Program Files (x86)\Microsoft Visual Studio 12.0\VC\include (change 12.0 to 11.0 or 10.0 based on your VS version)

 

This should help with the Node modules that require C++/Python compilation.

Hope this helps someone
‘“Those that know, do. Those that understand, teach.” ~ Aristotle ‘

TFS 2013 “Some features of Team Web Access are not visible to you” and “Test” tab not available

I am the admin of the TFS server, I have a VS 2013 Ultimate license, and I am pretty sure I have installed TFS properly (why? Because everything else seems to be working), but I am not able to find the much-hyped TFS Web Test Manager. On top of that, TFS tells me that “Some features of Team Web Access are not visible to you. In order to use all the features of Team Web Access, you must have the correct license and configuration.”


Seems like TFS intentionally wants to play hide and seek. Anyway, the way out of these two irritating problems is actually quite simple (and no, it’s not rebooting the server):
  • Go to the settings page (the settings icon at the top-right corner) and click on ‘Control Panel’.
  • Navigate to the ‘Access Levels’ tab.
  • The default access level is ‘Standard’; change it to ‘Full’.


  • And voila, you get rid of that irritating message and also get back the ‘Test’ tab in your TFS Web Access.
Hope this helps someone
‘“Those that know, do. Those that understand, teach.” ~ Aristotle ‘

Error deleting VHD: There is currently a lease on the blob and no lease ID was specified in the request

While deleting a VHD from Azure Storage, you might get an error that goes something like “A lease conflict occurred with the blob”, etc.

What Azure is trying to tell you, in its own friendly way, is:

  • Delete the VM, which might still be running
  • Delete the data disk attached to the VM

The error occurs because an existing VM is still using the disk, or because the VHD you are attempting to delete is still registered in the portal as a disk or an image under Virtual Machines > Disks or Images.
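
If the lease still lingers even after the VM and the disk/image registration are gone, a minimal sketch of one option (using the classic Microsoft.WindowsAzure.Storage SDK; the container name, blob name, and connection string below are placeholders) is to break the lease on the VHD blob and then delete it:

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class BreakVhdLease
{
    static void Main()
    {
        // Hypothetical values: replace with your storage connection string, container, and VHD blob name.
        CloudStorageAccount account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>");
        CloudBlobClient client = account.CreateCloudBlobClient();
        CloudBlobContainer container = client.GetContainerReference("vhds");
        CloudPageBlob blob = container.GetPageBlobReference("myvm-disk.vhd");

        try
        {
            // Break the lingering infrastructure lease immediately.
            blob.BreakLease(TimeSpan.Zero);
        }
        catch (StorageException)
        {
            // No active lease on the blob; nothing to break.
        }

        blob.DeleteIfExists();
        Console.WriteLine("Lease broken (if any) and blob deleted (if it existed).");
    }
}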

Hope this helps someone.
‘“Those that know, do. Those that understand, teach.” ~ Aristotle ’

Multiple ASP.NET forms authentication applications on the same IIS

If you are reading this and thinking, “why the heck is he writing about something so utterly stupid?”, then my short answer is

“I learned it the hard way”.

The other day I was testing the scenario of having two applications, both running on Cassini with ASP.NET forms authentication. What I realized was that if both apps are running, and I am on the login page of App1 and the landing page of App2 (having already logged into App2), then forms authentication passes if I go directly to the landing page of App1. After a few minutes of grumbling, I found something similar here.

The solution is to provide a unique ASP.NET auth cookie name for each application:

<system.web>
  <authentication mode="Forms">
    <forms loginUrl="~/Admin/Account/SignIn" name=".ASPXAUTH_Blog" />
  </authentication>
  <!-- ... -->
</system.web>
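
To double-check which name each site ends up using, here is a quick sketch (the CookieCheck page name is made up for illustration) that reads the configured forms-auth cookie name at runtime:

using System;
using System.Web.Security;

// Drop this into any page of each application to confirm the two sites issue
// differently named forms-auth cookies (e.g. ".ASPXAUTH_Blog" vs the default ".ASPXAUTH").
public partial class CookieCheck : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // FormsCookieName reflects the name="" attribute of <forms ... /> in web.config.
        Response.Write("Forms auth cookie name: " + FormsAuthentication.FormsCookieName);
    }
}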

 

Hope this helps someone
‘“Those that know, do. Those that understand, teach.” ~ Aristotle ’

Continuous Integration/Deployment with Visual Studio + GitHub + Jenkins + MSBuild + MSDeploy

In my earlier posts, I wrote about continuous integration and continuous deployment (here and here) in ASP.NET with TFS and Octopus Deploy. This post tackles the same beast, but in a different way.

Welcome to the world of free stuff!!

Visual Studio integration with GitHub

1) Let’s get our Visual Studio set up first with an ASP.NET Web Forms project.

2) Add the solution to source control (GitHub). I used VS 2013, which has integrated support for Git; in case you are not using VS 2013, download the add-on here. To set up the code, follow the steps mentioned here.

Once we have the GitHub account ready and the demo code added to the repository, we are ready for a sample deployment.

At this stage, the code should be visible in the GitHub repository:

[screenshots]

Jenkins Build Using MSBuild

For setting up the Jenkins CI server, I used an Azure VM with Windows Server 2008 R2.

1) Download Jenkins (I used 1.551) and open localhost:8080.

2) By default the Jenkins portal runs on port 8080 (so if you are going the VM route, make sure you have the port open on the firewall).

Jenkins Configuration

We need to configure Jenkins (see screenshot below): Manage Jenkins -> Manage Plugins -> Available tab.

[screenshot]

From the Available tab, choose the MSBuild, Git Plugin, and GitHub Plugin plugins and install them. We also need to download and install Git.

Post installation, we need to configure the build and code-repo plugins under Manage Jenkins -> Configure System:

– Configure Git, and mention the path to the Git executable

– Configure MSBuild

– Configure Git Plugin

– Configure Git Web Hook: this will create a hook to GitHub and allow us to be notified in case of changes in Git

Go back to the Home page and add a Jenkins Item (project)

– Mention the Item Name

– Choose ‘Build a free-style software project’

Configure Jenkins Item \ Project Level Configuration

Provide the following details:

– Project name

GitHub project: https://github.com/<YourGitHubAccount>/<YourRepo>

– Source Code Management

Choose Git, and provide the Git repo URL: https://github.com/<YourGitHubAccount>/<YourRepo>.git

Under Credentials, click ‘Add’ and provide your GitHub credentials.

– Repository browser: choose githubweb

URL: provide the GitHub repo URL: https://github.com/<YourGitHubAccount>/<YourRepo>.git/

– Build Triggers: choose ‘Build when a change is pushed to GitHub’

– Build: provide the MSBuild version and the name of the solution file.

For the deployment to happen on build completion, we need to include the publish settings (.pubxml) in the solution.

Then provide the following build parameters in the ‘Command Line Arguments’ input:

/p:DeployOnBuild=true /p:VisualStudioVersion=12.0 /p:Password=<PasswordtoDeploymentBox> /p:PublishProfile=<YourPublishFile>.pubxml

In case of Visual Studio 2012, change the Visual Studio version to 11.0.

GitHub Web Hook

Next is setting up the web hook in GitHub.

Navigate to Settings under your project in GitHub, and click on ‘Webhooks & Services’.

Click on ‘Configure services’ -> Jenkins (GitHub plugin).

Hope this helps someone

Ext.NET DataView with Drag-Drop

Drag and drop looks cool. Accept it!

But drag and drop with an Ext.NET DataView was something I was not really prepared for. So after hours of head scratching, I came up with this:

//target dataview

<ext:DataView Cls="hospital-target images-view" ID="dvDroppedImage" runat="server" SimpleSelect="true" SelectedIndex="0" OverItemCls="x-item-over"
    ItemSelector="div.thumb-wrap" EmptyText="No images to display">
    <Store>
        <ext:Store ID="strDropZone" runat="server" AutoLoad="false">
            <Model>
                <ext:Model ID="Model8" runat="server" IDProperty="guid">
                    <Fields>
                        <ext:ModelField Name="Name" />
                        <ext:ModelField Name="Url" />
                        <ext:ModelField Name="Description" />
                        <ext:ModelField Name="guid" />
                    </Fields>
                </ext:Model>
            </Model>
        </ext:Store>
    </Store>
    <Tpl ID="Tpl4" runat="server">
        <Html>
            <div style="height: 100px; width: 100px; border: 2px dotted silver; margin-left: 5px; float: left">
                Place an image here
            </div>
            <tpl for=".">
                <div id="Div1" class="thumb-wrap">
                    <img src="{Url}" title="{Name}">
                    <span>{Name}</span>
                </div>
            </tpl>
            <div></div>
        </Html>
    </Tpl>
</ext:DataView>

//source dataview
Nothing special here… a plain old DataView.

Just place a DragZone and a DropZone on your page:

<ext:DragZone ID="DragZone1" runat="server" Target="={#{ImageView}.getEl()}">
    <GetDragData Fn="getDragData" />
    <GetRepairXY Fn="getRepairXY" />
</ext:DragZone>

<ext:DropZone ID="DropZone1" runat="server" Target="={#{dvDroppedImage}.getEl()}">
    <GetTargetFromEvent Fn="getTargetFromEvent" />
    <OnNodeEnter Fn="onNodeEnter" />
    <OnNodeOut Fn="onNodeOut" />
    <OnNodeOver Fn="onNodeOver" />
    <OnNodeDrop Handler="X.NodeImageDrop(data);" />
</ext:DropZone>

The DragZone asks for the source, and the DropZone asks for the target. The functions mentioned are useful for hooking up your custom code:

var getDragData = function (e) {
    var view = App.ImageView,
        sourceEl = e.getTarget(view.itemSelector, 10),
        d;

    if (sourceEl) {
        d = sourceEl.cloneNode(true);
        d.id = Ext.id();

        return (view.dragData = {
            sourceEl: sourceEl,
            repairXY: Ext.fly(sourceEl).getXY(),
            ddel: d,
            patientData: view.getRecord(sourceEl).data
        });
    }
};

var getRepairXY = function () {
    return this.dragData.repairXY;
};

var getTargetFromEvent = function (e) {
    return e.getTarget(".hospital-target");
};

// On entry into a target node, highlight that node.
var onNodeEnter = function (target, dd, e, data) {
    Ext.fly(target).addCls("hospital-target-hover");
};

// On exit from a target node, unhighlight that node.
var onNodeOut = function (target, dd, e, data) {
    Ext.fly(target).removeCls("hospital-target-hover");
};

// While over a target node, return the default drop-allowed class, which
// places a "tick" icon into the drag proxy.
var onNodeOver = function (target, dd, e, data) {
    return Ext.dd.DropZone.prototype.dropAllowed;
};

The final piece is, however, important: the post-drop event. I am handling that on the server side (personal choice):

<OnNodeDrop Handler="X.NodeImageDrop(data);" />

The problem comes when you find out that ‘data’ is a complex JSON object and needs serious deserializing. To do that, I suggest this:
using System.Collections.Generic;
using Newtonsoft.Json;

public class ImageData
{
    public string Name { get; set; }
    public string Url { get; set; }
    public string Description { get; set; }
    public string guid { get; set; }
}

public class RootObject
{
    [JsonProperty("sourceEl")]
    public object SourceEl { get; set; }

    [JsonProperty("repairXY")]
    public int[] RepairXY { get; set; }

    [JsonProperty("ddel")]
    public object Ddel { get; set; }

    [JsonProperty("patientData")]
    public ImageData PatientData { get; set; }
}

public void NodeImageDrop(string data)
{
    RootObject ab = JsonConvert.DeserializeObject<RootObject>(data);
    List<ImageData> p = new List<ImageData>();
    p.Add(ab.PatientData);

    // Bind the dropped record to the drop-zone store.
    srtNewFarm.DataSource = p;
    srtNewFarm.DataBind();
}
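
For reference, here is a throwaway console sketch (it assumes the RootObject and ImageData classes above plus Newtonsoft.Json; the JSON values are made up) showing the rough shape of the ‘data’ argument and how it deserializes:

using System;
using Newtonsoft.Json;

class DropPayloadDemo
{
    static void Main()
    {
        // Illustrative payload only; the real one comes from the Ext.NET drop event.
        string json = @"{
            ""sourceEl"": {},
            ""repairXY"": [120, 245],
            ""ddel"": {},
            ""patientData"": {
                ""Name"": ""scan-01"",
                ""Url"": ""/images/scan-01.png"",
                ""Description"": ""Sample image"",
                ""guid"": ""4f1c2d3e""
            }
        }";

        RootObject drop = JsonConvert.DeserializeObject<RootObject>(json);
        Console.WriteLine(drop.PatientData.Name); // scan-01
        Console.WriteLine(drop.RepairXY[0]);      // 120
    }
}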

Hope this helps someone
‘“Those that know, do. Those that understand, teach.” ~ Aristotle ‘

Azure IaaS VHDs, Disks, and Images

Ah! A comprehensive write-up.

Microsoft Azure MVP Mike McKeown's Blog

Within Azure IaaS exists the often overlooked, yet important, subtleties of Virtual Machine (VM) Disks and Images.  Both of these allow you to generate Azure IaaS VMs using differing strategies.  In this post, I will clear up misunderstandings on the relationship between the two and how they map into the VM and VHD model for Azure.

Creating and Uploading VHDs

Azure VMs use Hyper-V fixed format VHDs.  So if you upload a dynamically formatted VHD it will be converted to fixed format. This may cause an unexpected bloat during the upload and yield a size greater than the 127 GB disk size limit imposed for Azure IaaS VM images. Thus it is recommended that you create a fixed format VHD before you upload it to Azure and don’t rely upon the conversion from dynamic to fixed format during the upload process.  VHDs can be uploaded using a PowerShell cmdlet Add-AzureVhd

View original post 973 more words

SharePoint 2013 and the missing “Sign in as Different User”

sharepointblank

Seems like the option to sign in as a different user is missing in SharePoint 2013, but fear not, it is quite easy to bring it back. All you have to do is navigate to

  • \15\TEMPLATE\CONTROLTEMPLATES

and open the Welcome.ascx file in a text editor of your choice. Once opened, find the entry called “ID_RequestAccess” and place the following code right in front of it:

And magically, it’s back:

 

UPDATE:

Just stumbled upon two alternatives, which work without modifying the Welcome.ascx file. Read it on Nik Patels Blog, he provides a link to the Microsoft Knowledge Base. Basically it tells you how to modify the site-URL or to start your browser as another user. Might be handy when in a hurry or when there is no possibility to change the welcome.ascx. So follow this link.

View original post

Octopus and TFS Build: Continuous Integration-Deploy - Part 2

Hello folks,

Those of you wondering why this post is called ‘Part 2’, check out Part 1 first.

Setup Octopack and Nuspec

In the previous post we had set up continuous integration, but for Octopus a mere DLL is not enough; it needs a NuGet package. In our example we will use a tool called OctoPack. Once the code is set up for CI, install OctoPack with the ‘Install-Package OctoPack’ command via the Package Manager Console. We also need to create a .nuspec file.

Copy the content below into Notepad and save it as <projectname>.nuspec. The id field determines the name of the output .nupkg file.

<?xml version="1.0"?>
<package xmlns="http://schemas.microsoft.com/packaging/2010/07/nuspec.xsd">
  <metadata>
    <id>Demo</id>
    <title>Demo</title>
    <version>1.0.0</version>
    <authors>Your name</authors>
    <owners>Your name</owners>
    <licenseUrl>http://yourcompany.com</licenseUrl>
    <projectUrl>http://yourcompany.com</projectUrl>
    <requireLicenseAcceptance>false</requireLicenseAcceptance>
    <description>A sample project</description>
    <releaseNotes>This release contains the following changes...</releaseNotes>
  </metadata>
</package>

Setup output folder

Remember, in the last post I had asked you to select ‘This build does not copy output’. So if this build is not supposed to have output, do we have to build a NuGet package out of thin air?

Have no fear: open your C# project file (.csproj) in edit mode and add these lines

<PropertyGroup>
  <RunOctoPack>true</RunOctoPack>
  <OctoPackPublishPackageToFileShare>C:\Builds</OctoPackPublishPackageToFileShare>
</PropertyGroup>

Add them right after the line that contains ‘OctoPack’ (the OctoPack targets import). This tells OctoPack to run and to publish the package to the C:\Builds folder, which is the default folder on the TFS server for copying builds after MSBuild has built your solution.

Then right-click on your solution and select ‘Enable NuGet Package Restore’.

So, after all this, time for some results, eh? Check in your solution, and once the build succeeds, browse to the C:\Builds folder. It will contain a .nupkg file. This is the output we need for the final step: deployment.

The .nupkg file will be of the format abc.1.0.0.0.nupkg; you can change the version by setting the assembly version, and the name by changing the id in the .nuspec file.
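
As a quick illustration, OctoPack typically picks the package version up from the assembly version attributes, so bumping them in Properties/AssemblyInfo.cs changes the generated package (the values below are examples only):

// Properties/AssemblyInfo.cs (example values only)
using System.Reflection;

// OctoPack builds the package version from the assembly version,
// so this produces something like Demo.1.0.1.0.nupkg on the next build.
[assembly: AssemblyVersion("1.0.1.0")]
[assembly: AssemblyFileVersion("1.0.1.0")]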

Octopus Server and Tentacle

There is very nice documentation on the Octopus portal. Install the server on any machine (I set it up on a VS box) and set up the Tentacle on the deployment box.

Tentacle and Octopus have a client-server kind of relationship, which includes setting up a trust between the two via a thumbprint.

Nuget Feed

By now I am assuming your Tentacle is set up and looks like this:

[screenshot]

And your Octopus server has been set up with a project, but you are confused about ‘Steps’ and the NuGet feed?

[screenshot]

Next we will set up a NuGet feed, as explained here. I chose to go with an online NuGet feed, but you can go with a local feed as well if everything is happening locally.

Then go to Configuration and add the NuGet feed with a URL of the form ‘http://<your-feed-server>/nuget’.

My feed looks something like this:

[screenshots]

Project Deployment Setup

Once this is done, we will set up an IIS website on the deployment machine and configure our project step as ‘Deploy a NuGet package’.

NuGet repository: choose the NuGet feed you configured earlier.

Package: give the name of the package that was built using OctoPack (without the version number).

Roles: the role(s) you want to deploy to.

Go to the bottom of the page and set the IIS virtual directory.

Your project is ready for deployment. Piece of cake…

Release

Click on ‘Create Release’ and provide a version. Octopus automatically gets the latest NuGet package from your feed and sets the release version to the package version, but you have the choice to name it as you like.

Next, click on ‘Deploy Release’. This takes you to a page like this:

[screenshot]

You have the choice (if you set up your project with two environments) to deploy it to the environment of your choice.

You also have the choice of re-downloading the package, which is very useful if you changed your code and built a new package but have not changed the package version.

Clicking on ‘Deploy Release’ takes you to a nice ‘live’ log page which shows the steps being executed and any errors that might have occurred:

[screenshots]

The dashboard will now show a nice ‘Success’ image.

So there you go, folks. Veni, vidi, vici. Please let me know your thoughts on the execution of the steps, or whether you have found a better way of doing things 🙂

Hope this helps someone
‘“Those that know, do. Those that understand, teach.” ~ Aristotle ’

Octopus and TFS Build: Continuous Integration-Deploy - Part 1

Recently I was asked to understand Octopus and build a continuous integration environment with the ability to deploy seamlessly! Wow, won’t that be a huge thing, at least in the .NET world (I am an MSFT enthusiast, btw!).

Hmm, so what is this Octopus thing? Looks scary? Heck, it sounds scarier. Nah, it’s a gentle giant built by Paul Stovell to handle one of the worst parts of app development (IMHO): managing your deployments to different environments/machines, along with running and changing config files. Wow.

Octopus works in a client-server kind of relationship, where you have an admin portal on one machine to manage your deployments and servers, and clients (Tentacles) installed on every machine you want the deployment to happen on. (This is a very crude definition, and I will cover Octopus in detail in subsequent posts.)

What do we need to start ?

Install TFS and setup Build Server

So I got down to business, and the first thing I did was to get hold of a VM and install TFS 2010 on it. Nothing much to explain here; I will probably cover the steps for installing TFS 2010 in another post.

Set up your default collection and configure a build server.

Setting up your Code for CI
So, after setting up our build server, we would like to set up our code for continuous integration, so that every time you check in, it builds your code and runs your test cases (we will cover that in later posts).
1. I am taking a sample web application for our demo here.
2. Connect to TFS and to the project collection for which the build server was configured.
3. Go to the Team Explorer view and click on Builds.
4. Then click on ‘New Build Definition’.
5. Give your build definition a name.
 – For Trigger, select Continuous Integration.
 – Build Defaults: set a UNC path, or do as I did and select ‘This build does not copy output’. Why? I’ll let you know in a bit.
 – Process -> Items to Build -> set the configuration to Release.
 – Save the build definition.
6. So we have set up our build definition, and now to check if it works:
 – Right-click on the build definition and click on ‘Queue New Build’.
 – Once you queue the build, you can see a build running under ‘My Builds’. Double-click on it, and a new window opens with the build details.
 – If you are running this for the first time, the build might take 2-3 minutes.
 – If everything goes according to plan, you should see a success message.
This will have built the code into DLLs and put them on the TFS server under C:\Builds.
From here you can choose to go with Octopus or do your own packaging and deployment.
In subsequent posts, we will cover NuGet, OctoPack, Octopus Deploy, and Tentacle, where we will learn how to package the build and then deploy it.
Hope this helps someone
‘“Those that know, do. Those that understand, teach.” ~ Aristotle ‘