Monday, November 12, 2012

Now that you are Agile, what next? Let's consider Transactional Analysis (French: Analyse Transactionnelle)

I wanted to write a quick post on Transactional Analysis, a psychology/psychotherapy discipline, largely influenced by Freud, that I had heard about for years but never had the chance to look into. Well, not until tonight.

Scrum / Agile / XP ... come with a defined set of tools (daily stand-ups, iterations, scrum cards, ...). But those are only recipes. As opposed to traditional V-Cycle / Waterfall / PRINCE2, these basic tools alone are not enough for a project to succeed. What is also required is the soul and the conviction to improve yourself and the team.

Once you have the soul in place, to best achieve improvements you now have to:
- analyse patterns of success using Scrum Patterns, or more precisely patterns of failure, so that you can identify them, avoid them, and get rid of them (a definition I initiated, by the way);
- consider the dev team's communication, but also human interaction across the entire company.

That’s where soft-skills tools come into play. Unlike in computing, 2+2 is not necessarily 4 (except in some cases, geeks would say); the result depends on many external parameters that we have to understand.

Transactional Analysis is one of them, but let me also mention Neuro-Linguistic Programming (aka NLP, PNL in French), and a more “simplistic” one, PCM© (Process Communication Management©).

A French book that I would recommend is “L’analyse Transactionnelle” by René de Lassus (© MARABOUT, 1991). I wish to quote one page of it (at the end); it is not meant to summarize the technique, but it means SO much to me in understanding my environment.

It costs only €2.75, so please get it as I did, or from anywhere else.

Also consider having a look at PCM©, as its principle is even easier to understand; but like any soft skill, it requires lots of practice and training.

Obviously, even if you are not Agile, these techniques are still useful for understanding human interactions!


Thursday, August 2, 2012


I wanted to share one slide with you: 
Waterfall vs Agile project success rates from 2002 to 2010

No comment!

Thursday, July 12, 2012

Send email for free with Windows Azure

From this documentation, here is a way to send emails for free, provided you send fewer than 25,000 emails.
After some verification by SendGrid that you are not a spammer, you will have access to the SaaS portal:
Configure your account in a few steps, and everything is ready to send emails.

From your portal, you have access to the statistics.

Emails that are bounced, blocked, invalid, … are presented here:
As a .NET developer, simply add a NuGet package in Visual Studio:
SendGrid NuGet package
Within your .NET code, simply use the information you filled in:
And here you go:
using System.Net;
using System.Net.Mail;

// Create network credentials to access your SendGrid account.
var username = "your_sendgrid_username";
var pswd = "your_sendgrid_password";
var credentials = new NetworkCredential(username, pswd);

// Send a message through SendGrid's SMTP relay.
var client = new SmtpClient("smtp.sendgrid.net", 587) { Credentials = credentials, EnableSsl = true };
client.Send("from@example.com", "to@example.com", "Hello", "Sent from Windows Azure!");
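Outside .NET, the same SendGrid SMTP relay (smtp.sendgrid.net on port 587, SendGrid's standard SMTP settings) can be used from any language. Here is a minimal Python sketch; the addresses and credentials are placeholders:

```python
import smtplib
from email.mime.text import MIMEText

# Placeholder credentials: use the ones from your SendGrid portal.
username = "your_sendgrid_username"
password = "your_sendgrid_password"

msg = MIMEText("Sent from Windows Azure!")
msg["Subject"] = "Hello"
msg["From"] = "me@example.com"
msg["To"] = "you@example.com"

# Uncomment to actually send (requires network access and a valid account):
# with smtplib.SMTP("smtp.sendgrid.net", 587) as server:
#     server.starttls()
#     server.login(username, password)
#     server.send_message(msg)
```

The commented-out block performs the actual send; keeping it commented makes the sketch safe to run without a network connection.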
For more info:
By the way, their technical support is really quick to reply (less than an hour), via their SaaS CRM:

Thursday, June 7, 2012

Evaluating Application Release Management tools combined with TFS ALM

This post aims to start a list (not exhaustive) of Application Release Management tools that complement TFS for customizing and delivering complex deployment scenarios on the Windows Azure platform.
Here are some options: InRelease, Nolio, Attunity, UrbanCode, RightScale, New Relic, Microsoft System Center (or more specifically SC Orchestrator, previously called Opalis), Octopus, CodePlex TFS Deployer, MSBuild, MSDeploy, TFS ALM workflow (*) 
(again, we could find a lot more).

Update (July 2012): open-source Chef, from Opscode ==> VP of Product Management George Moberly demonstrates Chef's integration with Windows Azure at Microsoft TechEd Orlando 2012

[Update] Other release management tools are listed here:

I have not assessed them all, since it would take ages. However, I have started to shortlist some of them according to our needs & priorities. If you are a vendor, please take this 5-to-10-minute survey:
To be fair to all shortlisted vendors (I am open to any solution), I will provide them with 2 or 3 pages describing our needs so that they can try to convince us with a one-hour customized demo.
(*) TFS ALM used to deploy to Windows Azure: TechDays 2012 France, “Mise en place d'une démarche ALM avec Visual Studio pour Windows Azure” (ALM206) [i.e. “Setting up an ALM approach with Visual Studio for Windows Azure”]
Claude Rémillard, the Product Owner of InRelease, allowed me to publish his answers. It’s raw data, so forgive the presentation:
* Company website =
* Product name = InRelease
* User roles are usually supported by the tool:
    .Developer (in a Continuous integration environment)
    .QA for User Acceptance Tests
    .Release manager (approval)
    .Release engineer
    .Non-regression team (approval)
* The application is "Agile" enough to deploy more than once a day
* Support Windows / .Net code deployment
* Requires the installation of an Agent on the targeted on-premise server
    . But Not needed for the Cloud
* Automated provisioning for Microsoft Windows Azure Cloud
    . Possible, but not done out of the box. Complete support for Azure is on our roadmap.
* Support DACPAC database deployment
* Could trigger: Batch or PowerShell or Installshield
    . We support any tool that can be triggered from a command line
* Each deployment path (e.g. Dev / Integration / Demo_Version / Production) can be composed of re-usable sub-validation-blocks (e.g. path Dev = Dev+QA+STAGING+VALIDATION+PROD , and SalesDemo = Dev+QA+PROD)
* Can be used as Standalone as well as triggered / integrated with TFS 2010
* Has the ability to run in parallel processes
    . Currently, we have sequential steps per server, with parallel servers installation. In the coming months, we will support full sequential and parallel processes.
* Ability to connect to a database for audit trail (who deployed, when, which path, validators...)
* Can send emails to a mailing list
* Can visualize workflow of deployment in real-time during deployment
* Monitoring multiple simultaneous deployments ==> Partial. Can be done, but not all deployments are shown on one screen.
* Can perform pre-deployment checks ? (e.g. sufficient disk space ?)
* Can you capture and report on metrics (e.g. start-end time of each step of the process) ?
* Can you block a deployment if not all approvers have signed off?
* Can schedule the deployments (date / time), with support restart of the deployment cycle, to recover from service interruption

* Support on rollback ==> Current support for rollback is limited to redeploying the last version that was installed. Support for full rollback (a rollback step for each installation step) is planned for the coming months.
* Possibility to centralize all configurations / variables (Web.Config, connection String, ...) ==> Different values per environment for variables are entered in InRelease. At deployment time, InRelease will then copy the values for a specific environment in the corresponding configuration file.
* How are those configurations presented to us (eg. table per deployment path, XML file...) ==> In InRelease, in a table per application/environment.
* How do you centralize passwords (e.g. SQL Server accounts, service accounts, logins, ...) and restrict access? In which container are those passwords stored (e.g. XML, database, ...)? ==> Passwords are entered directly in InRelease, encrypted as soon as they are entered, kept encrypted in the database, and only decrypted on the target server where the deployment is made.
* One-time cost ==>    > $ 2001 (No recurring charge)
* Size of the company     < 50
* Number of customers     < 50
* Why we should purchase your product ==> We are a company with a long ALM background and we are very close to our customers. Fast support, we listen to the needs of our customers and are constantly improving the software based on their feedback.
* PS. I saw that you are located in Paris, do not hesitate to communicate with us en français if you prefer!

Saturday, April 7, 2012

More details on the Windows Azure deployment lifecycle?

Beyond this tiny window in Visual Studio 2010 (with Windows Azure SDK 1.6),
does anyone know where we could find detailed information on the Windows Azure deployment lifecycle?
Indeed, there are nice articles, such as:
but is there any diagram such as:

Note: in this example, the deployment times I provide are real, but based on a slow Internet connection (using a USB Internet dongle).

Tuesday, April 3, 2012

Lesson learned: ALM with SQL Azure and new tools (SQL Server Data Tools and latest DACPAC Data-Tier Application project)

The tools to manage Windows Azure, whether on CodePlex, in CTP, released by Microsoft, or third-party, are numerous. To add some more complexity, the very same tool (e.g. SQL Management Studio) lets you perform the same action in many possible ways. The degree of freedom is thus huge. However, not ALL combinations work; some are not documented at the moment, or you have to find a KB article to understand why something does not work the way it should.
This post presents an easy way to migrate your on-premise database to SQL Azure, using the latest Microsoft tools for better SQL ALM than ever before.
After migrating your on-premise database by completing the first 4 steps of my previous post (the remaining steps just provide ways to check the integrity of your migration), you’ll need to create a DACPAC project. This time, however, you’ll be using the following latest Microsoft tools:
* SQL Server Data Tools [free], which extends your Visual Studio 2010 capabilities (check all the prerequisites; the install can take up to 1 hour). You could consider using Visual Studio 11 Beta… but only for SQL Azure / DACPAC projects (indeed, the Windows Azure side is another story with VS11 Beta).
This extension lets you work with *.sql files in a better way, as we will see later.

* SQL Server 2012 Management Studio [currently, get SQL Server 2012 and uncheck everything except Management Studio].
I would recommend not using SQL Server 2008 Management Studio (unless you want some frustration): with the latest version of Data-tier Application projects it presents some minor limitations (bugs? features? Smile), and its working principles have been ‘improved’ in “SQL Server 2012 Management Studio”.
To create the new type of Data-tier Application project, called a “SQL Server Database Project”, within Visual Studio 2010, follow steps 5 to 7 of my previous post mentioned above (as a reminder, you need to install SQL Server Data Tools).
You could also let “VS2010 with SQL Server Data Tools” migrate an existing VS2008 DB project:
In the new DB project’s properties, you should target the “SQL Azure” platform and set the Data-tier Application properties to a suitable version number (this number is used by the “intelligence” of DACPAC, which is now handled by SQL Server 2008 R2 or 2012; we no longer deploy “stupid” *.sql files).
It is worth noting that with this new type of DB project you can decrease the number of *.sql files considerably, for instance from 800 files down to 100, depending on the complexity of your database. This is because the new project is organized differently:
* No more huge tree of crazy sub-sub-sub-sub-folders with a file for every single index, constraint, etc. Here is an example of an old VS2008 DB project applied to a very basic and small project (the ASP.NET membership provider):
* Now everything is nicely grouped in a more “SQL-Server-like” fashion. In the top panel below (1), you’ll find a visual SQL designer for a table with its associated constraints and properties; in the bottom panel (2), you’ll find the plain-text version, as in the old way… except that your constraints, indexes, etc. are now part of the same *.sql file.
The smaller number of files also means fewer files to get, check in, and track in source control.
You will also notice a few variations in the “Build” options in the *.sql file’s properties (3). However, short of setting additional deployment files to Build=”None” or “Build Extension Configuration”, I did not manage to make it work properly (as it did in VS2008) when using related *.sql files called from a master *.sql file with the “:r” option.
With this type of project, by targeting “SQL Azure”, which is the most constraining scenario (because your T-SQL has to be cleaned up), you can publish either on-premise or to SQL Azure using the very same T-SQL (you might add some IFs if you really need SQL instructions specific to SQL Azure).
As far as publishing is concerned, you can define a configuration file per environment (e.g. click (1) to load various profiles, such as on-premise staging, on-premise validation, localhost, SQL Azure staging, SQL Azure PROD, …). To get the new publish window (notice it is no longer called “deploy”), right-click your VS DB project > “Publish…”, then wait 2 to 10 seconds for the modal pop-up to appear:
Ensure you have checked the 2 options (2) to benefit from the latest DACPAC type of deployment, and simply click (3) [which also means you could still deploy the old way by configuring the numerous options under Advanced…].

During publishing, you’ll see a new output window that keeps things nice and tidy, as opposed to the previous plain-text version. It reminds me of the Publish for Windows Azure… but in a more usable way! (i.e. no tiny report with a tiny scrollbar where, when a bug occurs, you don’t know what happened). I like this new “Data Tools Operations” output window:
          - it has many “view details” links that open a full-size window when needed
          - it can be collapsed (like the Windows Azure deployment one)
It also seems to deploy/publish and generate data a lot faster than the previous version (maybe due to better caching).
Now, if you want to deploy to SQL Azure, simply load the corresponding config file, and here you go: it’s on SQL Azure in 1 click! You could also use a command line to do this, so that a TFS build publishes automatically for you (which I would not recommend, since publishing continuously, 10 or 20 times a day for instance, will considerably impact your Windows Azure bill).
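For the command-line option, the tool that ships with SQL Server Data Tools is SqlPackage.exe. Here is a sketch, wrapped in Python so a build script could drive it; the /Action:Publish style of flags is SqlPackage's own syntax, but the server, database, and credentials below are placeholders:

```python
import subprocess

# Placeholder target values; SqlPackage.exe ships with SQL Server Data Tools.
cmd = [
    "SqlPackage.exe",
    "/Action:Publish",
    "/SourceFile:MyDatabase.dacpac",
    "/TargetServerName:yourserver.database.windows.net",
    "/TargetDatabaseName:MyDatabase",
    "/TargetUser:your_login",
    "/TargetPassword:your_password",
]
print(" ".join(cmd))

# On a machine where SqlPackage.exe is on the PATH, a TFS build step could run:
# subprocess.check_call(cmd)
```

The actual invocation is left commented out, since it only makes sense on a machine with SSDT installed and real credentials.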
Example of deployment times for a database with less than 100 MB of data to be populated:
- on a local DB: 5 sec
- on SQL Azure (broadband): 4 min
- on SQL Azure (USB Internet dongle, no broadband): 15 min
The last new feature I will present, which really eases your ALM, is the “Snapshots” facility. Before any staging or production deployment, keep track of the exact deployment package you delivered by storing it as a snapshot. It stores your project as a DACPAC file within seconds (recall: a DACPAC file is a ZIP file of your DB schema stored as XML files), which should then be kept in your favorite code repository; in TFS, for example, it could be a TFS release branch.
These new possibilities allow a huge gain in productivity!!!
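To illustrate the remark above that a DACPAC is just a ZIP archive of XML files, here is a toy Python sketch. The entry names model.xml and DacMetadata.xml are the ones found inside real DACPACs, but the XML content here is a placeholder:

```python
import io
import zipfile

# Build a toy "dacpac" in memory; real ones also contain Origin.xml and
# [Content_Types].xml, and model.xml holds the full schema description.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("model.xml", "<DataSchemaModel />")
    z.writestr("DacMetadata.xml", "<DacType />")

with zipfile.ZipFile(buf) as z:
    print(z.namelist())  # ['model.xml', 'DacMetadata.xml']
```

The same zipfile approach can be used the other way round, to peek inside a snapshot and see exactly which schema you delivered.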
Although I have been using these tools on large projects, I would be really interested in further practical feedback from anyone, or maybe different usages.

Friday, February 17, 2012

[Azure] Migrating your SQL Server 2008 Database to SQL Azure

There are many ways to migrate your SQL Server 2008 database (SSIS bulk copy, Data Sync, dedicated tools, …). Here is one of them, which was quite painless and free of charge (as opposed to using commercial tools).
Because the Windows Azure world is moving really fast, this method might already be out of date as you read this!

  1. Let us assume you have VS 2010 and SQL Server 2008 R2, and an existing DB that you are happy with,
  2. Install and use the “SQL Azure Migration Wizard” tool to connect to your existing database. It will generate a T-SQL script flagging the errors that would occur in SQL Azure (you can find all the infringed rules alongside this CodePlex project),
  3. Modify your T-SQL accordingly to fix the highlighted errors and generate a new database. Let's call it DB_SQL_AZURE_COMPLIANT,
  4. Migration DONE!! (either let the tool correct things for you or do it yourself). This tool can even push your database (without its data) directly into SQL Azure.
    The steps that follow enable you to work in a secure manner, ensure your project stays compatible with SQL Azure, and provide a link showing how to also push your data (using BACPAC),

  5. With VS 2010, create (or use an existing) SQL Server Data-tier Application image,
  6. Reverse-engineer your SQL database into a Visual Studio Database DAC project using “Import Data-tier Application...”, connecting to your new DB_SQL_AZURE_COMPLIANT database,
  7. You will have a VS 2010 DB project able to generate a DACPAC v2.0 deployment file (basically, a ZIP file that contains the schema of your database, without any data; as opposed to BACPAC, an extension of DACPAC which also contains your data in JSON format). More info on the DACPAC format:
  8. Now you have a DB project that is most likely deployable to SQL Azure (you can only be certain once you have migrated everything). You can verify this by running the “SQL Azure Migration Wizard” tool again and deploying to SQL Azure. You should now have no errors.
  9. To double-check it is really compliant, we are going to use our VM,
  10. Mount your VM and configure its network card (so you can access it via MSTSC),
  11. Make a snapshot of your VM (in case you want to roll back),
  12. Copy/paste your DAC project (created with VS2010) into the VM, open it with VS11, and confirm that you want your project migrated to a VS11 DB project,
  13. Because you’ve done a great job correcting the T-SQL errors, and you have great stored procedures, views, … and no errors, you can activate ALL Code Analysis rules (below) with confidence!!
    Also, ensure the “Data-tier Application (.dacpac file)” checkbox is selected, and that the “Target platform” is “SQL Azure”.
  14. Install the Silverlight 5 add-on in the VM. This is because the existing one on the VM is old and will not support the new SQL Azure web portal
    (generally, unless you know what you are doing, do not install SL5 on your DEV machine if you are working with SL4, or you will spend hours uninstalling; indeed, it will break your SL4 dev environment),
  15. Connect to your Windows Azure web portal and go to “Database” (this part is in Silverlight 4),
  16. Click “Manage”; this will open the web portal to manage SQL Azure,
  17. This opens a new web portal to manage SQL Azure (in Silverlight 5),
  18. Now you can play with your database newly migrated into SQL Azure (type plain T-SQL and run it). This web portal contains MANY MANY MANY great features that you should discover ONLY progressively (because it looks nice, but a bit messy!!),
  19. The final step is to populate your database with data (as a quick hack, simply use a T-SQL script). Indeed, DACPAC only deals with the schema; data is handled by BACPAC:
    But this is another story, based on other Windows Azure platform components and further CodePlex projects! 
By the way, with a large database I never managed to make the Microsoft “SQL Azure Compatibility Assessment” tool work. It seems really promising, but it always tells me that my DACPAC is in an incorrect format and that I should use SQL Server Data Tools (CTP4) to generate it.
But how come?? Unless the version of SQL Server Data Tools installed in my VS11 is out of date! In which case, I must once again chase the latest tool and the latest news concerning Windows Azure.