
Wednesday, November 16, 2011

How Alerts in AX work with templates and emails

Alerts in AX can be very useful, and I wanted to touch on how they work, along with some explanations, tips, and tricks.  Much of this post is basic, but there are some caveats explained as well.  I mainly use alerts to notify a user when a field or record is created, modified, deleted, etc.  A user specifically wanted to know whenever a credit limit changed for a customer...so this is the scenario I will be covering.

You first need to make sure you have an alert email template set up.  This can be done from Basic>Setup>Email Templates.  Here you create the header record for the alert template; the line records are for different languages, should you need them.



After you create the basic one as pictured, click the "Template" button to define the template body.  Under the HTML tab, I suggest using the sample HTML code from http://technet.microsoft.com/en-us/library/aa834376(AX.50).aspx

You then need to configure the alert parameters at Basic>Setup>Alerts.  Choose the "Alerts" email template ID you just created.


The "Drill-down target" can be confusing...but here is my interpretation of it.  You basically put something different for each of your AX environments (Dev/Test/Prod).  When you have an alert email sent automatically with hyperlinks, it will be something like Dynamics://DEV... (instead of http://DEV...) that will be clickable, and open an AX client and go directly to the alert or the alert's origin.

When you click the link, it uses your default client configuration and then confirms that the "DEV" found in the URL matches the drill-down target in the alert setup.  If it does not match, it simply doesn't open the client.  So if you are getting alerts from Dev/Test/Prod, only the ones with the correct Dynamics://[this part] will work.  This obviously means you must have a local AX client installed.

The types of alerts:
  • Change-based alerts
    • Alerts that occur when something changes, like a customer name, or customer credit limit
  • Due date alerts
    • Alerts that occur when something is due, typically workflow related

Both types have batch jobs, located at Basic>Periodic>Alerts, that must be running in order to actually create the alerts that meet the rule criteria.  If you aren't getting the alert bell icons in the lower corner of your session, then one of these is probably not running correctly.

Now let's set up the alert rule to notify us when the credit limit has changed for a customer.  You can set alerts on pretty much any field you can right-click on.  Go to the customer details form, then the General tab, then right-click on the credit limit field and click "Create alert rule" as pictured:


Base AX only lets you alert a single user.  You can create a user whose email account is a distribution list, or you can make some simple X++ modifications to support overriding the alert email or alerting multiple emails/users.  For now, just click OK here and then close the window behind it.

If we go to a customer and change their credit limit, an alert will be generated (provided the batch is running as stated earlier, or it is run manually).

For the email portion, alerts are set up to produce traceable emails.  Rather than sending directly, the alert is dumped into an email sending queue that can be found at Administration>Periodic>E-mail processing>E-mail sending status.


This emailing subsystem is very handy, and open for many useful modifications if you're feeling creative.  Emails in this queue are picked up for delivery by the e-mail processing batch, which is located at Administration>Periodic>E-mail processing>Batch.  This must be running for the emails to go out.

You can view sent emails with the body of the message, and also resend them should you choose to do so.

There is also a retry schedule that is located in the same place where you can set your retry intervals for failed email attempts.


The framework behind the scenes opens things up for some very useful modifications.  For example, you could periodically email customers a survey and send them reminders to fill it out, or keep sending emails until a user performs some action you are waiting on.
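
As a hedged illustration of that idea, here is a minimal sketch of queueing an email through this same subsystem from X++.  It assumes an email template with ID 'Survey' exists and that SysEmailTable::sendMail is available with roughly this signature in your environment (check the method in your AOT first); the template ID, address, and placeholder names are made up for the example.


static void queueTemplateEmail(Args _args)
{
    // Placeholders the template body can reference, e.g. %custName% and %dueDate%
    Map placeholders = new Map(Types::String, Types::String);
    ;

    placeholders.insert('custName', 'Contoso Retail');
    placeholders.insert('dueDate', date2str(systemDateGet() + 7, 123, 2, 1, 2, 1, 4));

    // 'Survey' is a hypothetical email template ID created under Basic>Setup>Email templates.
    // Nothing is sent immediately; the message lands in the queue at
    // Administration>Periodic>E-mail processing>E-mail sending status,
    // and the e-mail processing batch delivers it from there.
    SysEmailTable::sendMail('Survey', 'en-us', 'someone@example.com', placeholders);

    info("Email queued for delivery");
}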

This just briefly outlines and touches on alerts and emails.  If you have any questions, feel free to comment.  I will be elaborating on email templates later on.

Tuesday, November 15, 2011

How to call a form from code and make it modal

This is how to open a form from X++ and make it modal.  Modal means it stays on top of the other windows and you cannot click on the ones behind it.  I have this in the clicked event of a button.


void clicked()
{
    CustTable   custTable;
    FormRun     formRun;
    Args args = new Args();
    ;
    
    // Just selecting a random record in CustTable
    select firstonly custTable;

    // This is the name of the form you are opening
    args.name(formstr(CustTable));
    
    // This is the optional record you are passing it
    args.record(custTable);

    // This builds the formrun
    formRun = classFactory.formRunClass(args);
    formRun.init();
    formRun.run();

    // wait(true) makes the form modal and waits here until it is closed
    if (!formRun.closed())
        formRun.wait(true);
}

Monday, November 7, 2011

Changing dimensions on an item the easy way using xRecord.merge?

Many of us have hit the same issue at some point, where a dimension needs to be changed on an item, and going about doing it can be a very big hassle.

This post is purely experimental, and I am currently assessing the feasibility of it, so please be very critical of it.  I'm looking for feedback on issues I've not yet thought of.

A project I'm working on is changing/removing the serialization dimension on many different items. The customer has gone from AX 3.0 to 4.0 to 2009, so there is a good share of old/bad data, old open transactions, etc.

Our goal is to turn on "blank issue allowed" and "blank receipt allowed" to make transfers/movements easier.

There are two basic ways that I can think of to change the dimensions on the items, and these are very brief steps that would actually require more thought/elaboration in a real scenario:

  • Scrap out all inventory, close all open transactions, then change the dimension
    • Difficulty: High
  • Create a new item with desired dimension, transfer old item to new item, then rename primary key to get the ItemId back to old value
    • Difficulty: Low, but you lose item history, so this often won't work.

I tried to get creative and think outside the box on this one because the amount of work required to close all open transactions is huge, and it could mean several days, maybe even weeks, of downtime because of the volume of transactions constantly going on.  This customer also cannot be shut down for more than a few days.

My idea is to create a new item with the desired dimension, then use the kernel xRecord.merge functionality to merge the item with the old dimension into the newly configured item, and finally use the primary key rename functionality to put the ItemId back (a sketch of that rename step follows the code below).  I just tried this yesterday, so I've barely looked at system functionality beyond some simple tests.  The transactions came over, the BOMs came over, etc...so it has some promise.

Some initial thoughts on where this might have holes...since we are moving the old item into the new item, the new item will have a new RecId.  So if there are any tables that reference the item by RecId, this would clearly cause an issue.  I believe that if the correct relation is set up, it might propagate correctly.  Tables like Address or DocuRef (document handling), however, which use commons and reference by table ID and RecId, might have some issues.  I have yet to test this.

There are some other minor steps that will need to be done in this code, such as confirming that the tables I delete have their previous values carried over...things such as item price.  If you have other custom code, you may need to add or remove some of the tables I deleted.  The discovery process on this was pretty simple: I ran it, the validation would fail, I'd delete records out of the offending table, and I'd rerun until it worked.

The "Master" item here is the new item with the new dimension.  The "child" item is the original item with the dimension we want to change.

Anyway, here is the proof of concept code you've been waiting for.  Please comment with thoughts/concerns/experience:




static void ItemMerge(Args _args)
{
    ItemId      iMaster = '26101A'; // New item with newly configured dimension
    ItemId      iChild  = '26101';  // Original item with old dimension

    InventTable master;
    InventTable child;
    CUSInventTable          t1;
    InventTableModule       t2;
    InventItemLocation      t3;
    InventItemPurchSetup    t4;
    InventItemSalesSetup    t5;
    ConfigTable             t6;
    ;


    ttsbegin;
    delete_from t1
        where t1.ItemId == iMaster;

    delete_from t2
        where t2.ItemId == iMaster;

    delete_from t3
        where t3.ItemId == iMaster;

    delete_from t4
        where t4.ItemId == iMaster;

    delete_from t5
        where t5.ItemId == iMaster;

    delete_from t6
        where t6.ItemId == iMaster;

    child = InventTable::find(iChild, true);
    master = InventTable::find(iMaster, true);

    child.merge(master);
    master.doUpdate();
    child.doDelete();
    ttscommit;

    info("Done");
}
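
The one step not shown above is renaming the surviving item back to the original ItemId.  Here is a minimal sketch of that follow-up step, assuming the merge above has already committed and reusing the same example item IDs; renamePrimaryKey() cascades the rename through related tables, so expect it to take a while on large datasets.


static void ItemRenameBack(Args _args)
{
    InventTable inventTable;
    ;

    // '26101A' survived the merge; rename it back to the original '26101'
    inventTable = InventTable::find('26101A', true);
    inventTable.ItemId = '26101';
    inventTable.renamePrimaryKey();

    info("Rename complete");
}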

Thursday, August 4, 2011

How to change a customer's party type

I often see customers that are created wrong in clients' systems.  They will have a party type of "Organization" when they should really be "Person", or vice versa.  The difference between the two, as far as I can tell, is that organizations are permitted to have multiple contacts, and other little things like that, while person types are supposed to be simpler.

Here is a little static method I wrote to change their party type.  One thing to be careful of: when you go from Organization to Person, you may want to check whether there are multiple contacts before you make the change, because I'm not exactly sure what it will do with them.  A sketch of that check follows the method below.




static server boolean changeCustPartyType(CustTable _custTable, DirPartyType _dirPartyType)
{
    CustTable       custTable;
    ;

    if (_custTable && _custTable.PartyType != _dirPartyType)
    {
        ttsbegin;

        custTable.selectForUpdate(true);
        custTable.data(_custTable);

        custTable.PartyType = _dirPartyType;

        custTable.setNameAlias();

        DirParty::updatePartyFromCommon(custTable.PartyId, custTable, DirSystemPrivacyGroupType::Public, true);

        custTable.doUpdate();

        smmBusRelTable::updateFromCustTableSFA2(custTable);

        custTable.setAccountOnVend(custTable.orig());

        smmTransLog::initTrans(custTable, smmLogAction::update);

        ttscommit;

        return true;
    }

    return false;
}
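
As a hedged sketch of the multiple-contacts check mentioned above (it assumes ContactPerson.ContactForParty is the field linking contacts to the party in your version; verify against your data model), you could guard the call with something like this:


static boolean hasMultipleContacts(CustTable _custTable)
{
    ContactPerson contactPerson;
    ;

    // Count the contacts attached to the customer's party record
    select count(RecId) from contactPerson
        where contactPerson.ContactForParty == _custTable.PartyId;

    return contactPerson.RecId > 1;
}

You would call this before changeCustPartyType and decide what to do with any extra contacts first.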

Wednesday, June 22, 2011

Dynamics AX 2012 Upgrade thoughts

I've been working on upgrading a demo environment of AX 2009 to AX 2012 to get the experience under my belt.  Like any good upgrade, it takes careful planning and execution to be performed properly.  I'll be blogging my adventures through this upgrade as I go, so check back for additional posts.

I have a good deal of AX 4.0 to AX 2009 upgrade experience and AX 2012 upgrade is definitely different.  From AX 4.0 to AX 2009, you basically did three things (not exactly this order):
  • Merged all of your customizations against the new sys/syp layers
  • Extended the ReleaseUpdateDB41* classes if needed to upgrade data
  • Ran the upgrade scripts to upgrade fields and move data around

With AX 2012, there are many more pre-upgrade validation steps where your business data is prepped and validated more closely.  This can mean more up-front work, but ideally it results in a more seamless upgrade.  The downside is that it is much more of a team effort initially, involving financial decision makers in your organization as well as a few developers.  I do see the benefit of reducing the potential for error.  The upgrade to 2009 did more blind data dumping, while 2012 validates/examines what it's doing.


I've been working for about 6 hours now and have hit a few bumps, mainly because I don't have the answers to some of the financial bits of the upgrade.  Some helpful upgrade whitepapers that were recommended to me can be found here.  My environment is Microsoft's Refresh 3.5 with no layers other than SYS/SYP and a dash of USR; I just added a couple of modifications to see how the new process would work.  The demo data, however, is not perfect, and because it's a demo company, a good deal of the business inputs required for the upgrade will be guesses on my part.

The upgrade begins by having you import a giant XPO (9.43 MB) into the USR layer that prepares the system for upgrade.  The first step after that is "Check upgrade readiness", which is essentially a bunch of batch jobs that do all sorts of validation on your system.  I had 1228/1229 complete, with one erroring out.  The results of the completed jobs seemed a little daunting.  Be prepared to have your controller handy.


This step is not required, but let's face it...if you don't get past most of this stuff, it's almost guaranteed you will face issues down the road.  The issues it brings up are, for the most part, legitimate data problems that do need to be fixed.  I chose to ignore most of the errors, as I didn't want to spend a long time guessing my way through fixes when I don't really know the answers.

The next step "Initialize preprocessing", I think just creates or fills these "Shadow_*" tables in the AOT used to store upgrade data.

Then you have 22 steps to complete, of varying difficulty.  Some of these steps seem like they could have had a "best guess" option.  For example, I had to set up country/region code mapping.  AX 2009 was on ISO 3166-1 alpha-2, and AX 2012 is going to be on ISO 3166-1 alpha-3, meaning 2-letter country codes map to 3-letter country codes.  This seems like something that could auto-populate.  You also must configure it for each company.  For me, this was 54 country codes such as "US" that I had to map to "USA" in each company.  Just a little time consuming.  Better safe than sorry though.

Other parts of the upgrade require the DAT company to have a number sequence, ledger account, and other small bits of data filled in to proceed.

The errors can seem cryptic, and if you have any bad data in your system, the errors will easily lead you astray.  It is usually pretty easy, though, to determine via the code and the debugger what is causing an error to be thrown.  For example, the demo data had some strange unit conversions for a specific item in one of the companies.  That company had 0 items, so the conversions were just bad data, and the errors were just catch-all type messages.  Some quick debugging and it was pretty easy to track down.

I've put in a few long nights after work, so I'm taking breaks in between so I don't wear myself out.  My next steps are:

  • Preprocess data on live system
    • Run live preprocessing scripts
    • Country/region upgrade
    • Run delta preprocessing scripts
  • Preprocess data in single user mode
    • Enter into single-user mode
    • Run single-user mode preprocessing scripts
The single-user mode steps, I believe, basically bulk copy the prepared data from AX 2009 to the AX 2012 system and finalize it.

The perk of this is supposed to be a shorter downtime to actually go live.  Stay tuned!

Tuesday, April 26, 2011

Generating better random numbers with Dynamics AX 2009

Inspired by this post, I created a neat little way to generate good random numbers within a range.

If you use the AX Random class or RandomGenerate class to generate random numbers, you will find that if you generate several in a row, they turn out fairly sequential.

The xGlobal::randomPositiveInt32() method does a slightly better job of producing a random integer.

I would sometimes see patterns using this when the output is sorted:



static void Job6(Args _args)
{
    int total = 20;
    int i;
    ;
    
    while (total)
    {
        i = xGlobal::randomPositiveInt32();

        /*
        // This is better
        while (i > 9999999)
            i = i div 10;

        while (i < 1000000)
            i = i * 10;
        */
        
        while (i > 9999999)
            i = i >> 1;

        while (i < 1000000)
            i = i << 1;

        info(int2str(i));
        
        total--;

    }
}
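
Building on this, here is a minimal sketch of getting a random number in an arbitrary range using the same xGlobal::randomPositiveInt32() call; the range values and job name are just examples.


static void randomInRange(Args _args)
{
    int minVal = 100;
    int maxVal = 999;
    int result;
    int i;
    ;

    for (i = 1; i <= 20; i++)
    {
        // mod keeps the value inside the range; adding minVal shifts it back up
        result = minVal + (xGlobal::randomPositiveInt32() mod (maxVal - minVal + 1));
        info(int2str(result));
    }
}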

Sunday, March 6, 2011

How to iterate over addresses in the global address book of Vendors/Customers

I'm frequently asked to mass update addresses for vendors or customers during data imports when something isn't imported correctly.  Here is a simple job I wrote that shows the relationship between the tables and that you can modify to fit your needs.



static void iterateOverAddresses(Args _args)
{
    VendTable                           vendTable;
    Address                             address;
    DirPartyAddressRelationship         dpar;
    DirPartyAddressRelationshipMapping  dparm;
    ;

    while select address
        join dpar
        join dparm
        join vendTable
        where
              vendTable.PartyId         == dpar.PartyId                         &&
              dpar.RecId                == dparm.PartyAddressRelationshipRecId  &&
              dparm.RefCompanyId        == curExt()                             &&
              dparm.AddressRecId        == address.RecId
    {
        info(address.Name);
    }
}

Saturday, March 5, 2011

Enterprise portal security issues with Refresh 3.5 and AX 2009

I wanted to play around with a well-built demo environment so that I could experiment with some workflow ideas I had, so I downloaded Microsoft's Refresh 3.5 VPC from partnersource (located here).

I prefer to work as a typical user with typical security settings so that I can identify issues quickly.  After booting up the VPC and restarting it a few times so that it would adjust to my machine, there were quite a few changes I had to make to get it usable.

Firstly, I had to change the client configuration to point to AX593 and the business connector configuration as well.  If you forget the business connector, you will have EP issues, amongst other things.

I randomly chose contoso\Nancy as my testing user, and I immediately notice that I don't have access to EP at all:

So I switch back over to contoso\administrator to take a look at the security settings in EP and notice there really isn't anything setup:

Let's add Nancy to the viewer's group and see what happens...


So now we can at least see the role centers, but the KPIs are having an issue...hmm...this issue is typically related to Kerberos, but on the Refresh 3.5 VPC, it's only NTLM security because everything is on the same machine.

Let's remove Nancy from the viewers group, and just create our own "AX Users" group, but let's make it "Read" only instead of "View only", add Nancy to it, then refresh our role center:

And we're good.  The only difference between View Only and Read is the ability to view the source of documents with server-side file handlers.  See the SharePoint permissions matrix here for the security differences.

Friday, March 4, 2011

Undoing the Financial Dimension Wizard

During the initial setup of AX, I've seen where dimensions in one environment don't match the other environment. One place will have one called "Brand" and another will say "Brands", or "Event/Place" vs "Place/Event", for example. This can cause big issues, and be a bit of a pain.

To fix this, what I recommend is deciding which environment is the good one and pushing that environment's dimension code down to the bad one.  You'll then need to copy the good database down to the bad environment too...otherwise data will look screwy all over the place.

Fixing data is another discussion, and just reloading/replacing the database is the easiest.

Replacing the code is easy. This screenshot should give you all the information you need to fix the dimensions in the AOT:



  • Tables\LedgerJournalTrans
  • EDT\COSAllowDimensions
  • EDT\Dimension
  • EDT\DimensionAllocation
  • EDT\DimensionCriteria
  • EDT\DimensionKeepFromTransaction
  • EDT\DimensionLedgerAllocCriteria
  • EDT\DimensionLedgerJournal
  • EDT\DimensionPriority
  • EDT\MandatoryDimension
  • EDT\XMLMapDimension
  • BaseEnums\SysDimension
  • Forms\LedgerAllocation
LASTLY!!!
Labels need to be fixed.  The dimension wizard creates labels for these items, I believe in the DPA label file.  Just check the objects and see what their labels are.  You can also open the axDPAen-us.ald file (US English) in Notepad and take a look there.
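
To quickly compare the dimension names between two environments before touching anything, a small sketch like the following can print each SysDimension element and its current label so you can diff Dev against Prod.  It assumes the standard DictEnum API and that the enum values are contiguous starting at 0, which holds for SysDimension.


static void listDimensionLabels(Args _args)
{
    DictEnum dictEnum = new DictEnum(enumNum(SysDimension));
    int      i;
    ;

    // Print every SysDimension element name along with its label,
    // which is what the financial dimension wizard renames.
    for (i = 0; i < dictEnum.values(); i++)
    {
        info(strFmt("%1 = %2", dictEnum.value2Name(i), dictEnum.value2Label(i)));
    }
}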

Wednesday, February 16, 2011

Fixing addresses that are not formatted correctly

An issue that comes up often at various clients is addresses being formatted incorrectly.  Functional people have been able to fix this by "touching" the zip code on the affected address...basically they click on it, re-choose the zip code, and save to fix it.

What is going on behind the scenes is that the AddressMap is being used to correctly format the address.

The Address.address, CustTable.address, VendTable.address, etc. fields should not be manually filled in.  They should not have data dumped into them either.  The address is a derived/built field that depends on the country code and each country's individual address format.

This is important when you deal with international customers and customs.  Customs for European countries, for example, can be very picky with documents.  If the address is even slightly malformed, I've seen product get held for up to a month, and sometimes it's seized...all for a silly address error.

Here is a job I wrote that can be adapted to automate the process of building addresses using the AddressMap.  Right now, it lets you pick the customer whose addresses you want to correct, then displays a before/after of each address.

Enjoy!  As always, comments are welcome!



static void updateCustAddress(Args _args)
{
    Dialog                              dialog;
    DialogField                         field;
    CustTable                           custTable;
    Address                             address;
    DirPartyAddressRelationship         dpar;
    DirPartyAddressRelationshipMapping  dparm;
    AccountNum                          accountNum;
    ;

    dialog  = new Dialog("Build customer address");
    dialog.addText("Select your customer to update:");
    field   = dialog.addField(typeid(CustAccount));
    dialog.run();

    if(dialog.closedOk())
    {
        accountNum = field.value();

        if (CustTable::exist(accountNum))
        {
            ttsbegin;
            while select forupdate address
                join dpar
                join dparm
                join custTable
                where custTable.AccountNum      == accountNum          &&
                      custTable.PartyId         == dpar.PartyId         &&
                      dpar.RecId                == dparm.PartyAddressRelationshipRecId  &&
                      dparm.RefCompanyId        == curExt()               &&
                      dparm.AddressRecId        == address.RecId
            {
                info(strfmt("Type %1 - before:", address.type));
                info(strFmt(">%1<", address.Address));
                
                address.AddressMap::setAddress();
                address.update();
                
                info(strfmt("Type %1 - after:", address.type));
                info(strFmt("<%1>", address.Address));
            }
            ttscommit;
        }
        else
        {
            info("Invalid Account");
        }

    }

    info("Done");
}

Tuesday, February 8, 2011

Change form's background color for Dev/Test/Prod

This is a simple mod that saves TONS of headaches.  There are really three main ways to use this:

  1. Change color to signify dev, test, or prod environments
  2. Change color to signify which company is active
  3. Change color to signify which layer you're logged in to
Just put this code in Classes\SysSetupFormRun\Run to change the form's background color.  Surround it with an if (...) block to make it change by layer or company.
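
Here is a minimal sketch of what that override could look like, assuming the standard FormDesign.colorScheme()/backgroundColor() methods and WinAPI::RGB2int(); the color and the layer check are just examples, so swap in whatever condition fits your environments.


public void run()
{
    ;

    // Example: color the background in the USR layer only.
    // Swap the condition for a company check (curExt() == 'TST')
    // or a configuration value that differs per Dev/Test/Prod.
    if (currentAOLayer() == UtilEntryLevel::usr)
    {
        this.design().colorScheme(FormColorScheme::RGB);
        this.design().backgroundColor(WinAPI::RGB2int(255, 80, 80));
    }

    super();
}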


Monday, February 7, 2011

How to use TreeNode to navigate objects in the AOT

During a code merge into an environment managed by another company, I came across some tables with "UNKNOWN" delete actions in their production environment.  This worried me a little, so I whipped up a quick job to search every table for unknown delete actions to see what the real damage was.  This demonstrates how to use TreeNode to traverse the AOT.  The reason I used subStr for a portion of the code is that I also searched for "DEL_" delete actions.



static void findTablesWithSpecificDeleteAction(Args _args)
{
    #AOT
    #Properties
    TreeNode    rootNode;
    TreeNode    childNode;
    TreeNode    tempNode;
    ;
    rootNode = infolog.findNode(#TablesPath).AOTfirstChild();
    while (rootNode)
    {
        childNode = rootNode.AOTfindChild('DeleteActions');
        if (childNode)
        {
            tempNode = childNode.AOTfirstChild();
            while (tempNode)
            {
                if (subStr(tempNode.AOTname(), 1, 3) == "UNK")
                {
                    info(tempNode.treeNodePath());
                }
                tempNode = tempNode.AOTnextSibling();
            }
        }
        rootNode = rootNode.AOTnextSibling();
    }
}

Sunday, February 6, 2011

How to use a tree form control, recursion, and containers to view nested containers

Everybody likes something tangible (well...digitally tangible) that you can download, import, and learn from.  This is a single form I developed that demonstrates the use of tree form control, recursion, and nested containers.

I expanded on a previous post about using recursion with containers specifically for the purpose of looking closely at usage data.  Look closely at the comments, as there is some good info there.  You can pass any container you want in this form.

Usage data explained

Usage data is incredibly important to the end user, and very delicate as far as data goes in my opinion.  I'll expand on why I think it's delicate later on in this post.  This post will detail what I've discovered about it, but is limited due to the fact that usage data is system level and mostly hidden.

As a developer, I am clearing my usage data regularly without giving it a second thought.  I once made the mistake of clearing all of a user's usage data when they were experiencing a strange issue with a form.  They lost all of their stored queries, user-specific form customizations, ad-hoc reports, pre-filled in data, etc.  Let's just say I haven't made that same mistake twice.


Why I researched usage data:
We have some BUS-layer IP (intellectual property) that we have sold to a client.  It was sold with the understanding by both parties that it is in development, and a very living/breathing product.  They received a big price break for being the guinea pigs on the software.  Anyway, some objects were heavily customized, such as the SalesTable form.

We started phase 1 on IP version 1.0, and we were live and operating while phase 2 was developed.  We deployed phase 2 successfully from a technical standpoint.  The system looked exactly as I, the developer, would expect; however, many users were complaining that all of their IntelliMorph customizations to certain objects (SalesTable) were gone.

How usage data is stored:
Usage data is stored in a single hidden system table called SysLastValue.  It has the following main fields
  • userId
  • recordType - UtilElementsType::*
  • elementName
  • designName
  • isKernel
  • company
  • value - Container, this is where the important stuff is
Where usage data gets difficult:
Usage data, according to MS, is different for pretty much every object.  I like to think of it as a custom, system-level pack/unpack.  So unfortunately, this means you really can't just modify usage data without knowing the underlying code structure.  I personally think there is some sort of framework for these objects once they're past the SYS layer, like a custom pack method for SalesTable with a generic pack-other method for any custom modifications.

What you can do with this info:
You can back up usage data and copy it to other users.  If you have some custom IntelliMorph modifications to a form, for example, you can run a quick job to copy your saved settings to every user in the system.  You can make a "golden user" of sorts, where you make modifications to forms and then copy them to other users.  I'll be posting a neat code example shortly on how to use recursion, containers, and tree nodes to view usage data.
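
As a hedged sketch of that copy idea, a job like the following copies one user's SysLastValue records to another user.  The user IDs are made up, the field list is the one described above, and you should have both users logged out (and be ready to restart clients so cached usage data doesn't overwrite the copy) before trusting the result.


static void copyUsageData(Args _args)
{
    SysLastValue source;
    SysLastValue target;
    UserId       fromUser = 'golden';   // hypothetical "golden" user
    UserId       toUser   = 'nancy';    // hypothetical destination user
    ;

    ttsbegin;
    while select source
        where source.userId == fromUser
    {
        // Clear any existing record for the destination user so we don't collide
        delete_from target
            where target.userId      == toUser
               && target.recordType  == source.recordType
               && target.elementName == source.elementName
               && target.designName  == source.designName
               && target.company     == source.company
               && target.isKernel    == source.isKernel;

        target.data(source);
        target.userId = toUser;
        target.insert();
    }
    ttscommit;

    info(strFmt("Copied usage data from %1 to %2", fromUser, toUser));
}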

Disclaimer:
This is what I've come to understand of usage data from my experiences only.  My viewpoint is limited because the code is unavailable.

Formatting C#/X++ code to be viewable in blogger

This is a very useful website for other bloggers to convert their code.  I'm honestly posting this as a plug for them, and because I keep losing the link.

http://www.manoli.net/csharpformat/

A second one I like is below:
http://hilite.me/

Recursion with containers

Here is a little job that demonstrates recursion with containers.  I used a heavily modified version of this to analyze the way usage data works.




static void containerRecursion(Args _args)
{
    container look = ['ABC', ['1', '2', ['99', '88']], 'ZXY'];

    container recurse(container _c)
    {
        int         i;
        int         len = conLen(_c);
        container   retVal;
        ;

        while (i < len)
        {
            i++;

            if (typeof(conPeek(_c, i)) != Types::Container)
            {
                retVal += conPeek(_c, i);
            }
            else
            {
                retVal += recurse(conPeek(_c, i));
            }
        }

        return retVal;
    }
    ;

    info("Missing values: " + con2str(look));
    info("Not missing values: " + con2str(recurse(look)));
}

Thursday, February 3, 2011

Zip codes, postal codes, states, counties, countries of the world!

Surprisingly, it is very hard to find a good download of zip/postal codes, states, countries, etc.  Many places charge for this up-to-date information.  Where I work, there is often an Excel file being emailed around the office with zip codes that were harvested from some other system, which we dump into clients' systems.  It's sort of a copy-paste mash of different countries that we've slowly built, and it's mostly there.

Well, a somewhat unknown site that is incredibly comprehensive and available under a Creative Commons Attribution license is GeoNames.org.

What I did is go to:

Grab the latest allCountries.zip, import into excel (tab delimited), then throw some filters on there and use it as your master country/zip/state/county/etc list.

The site is a little difficult to navigate and they have many giant data dumps...for example, there are two "allCountries.zip" files I've come across.  One has the good stuff (ISO country, zip, state, county, etc.), and the other has stuff I can't really find a use for (population, elevation, etc.).

Hope this saves somebody else a headache when they get a client who does business globally!

Saturday, January 29, 2011

How to export license keys that are already loaded in your system

Every now and then, there is a need to export license keys from a system.  I needed to get into the VAR layer to modify code in a Microsoft VPC, but as we all know, Contoso license keys don't come with the BUS/CUS/VAR access codes.  So I did what any developer would do:

  • Write a job to export the Contoso license key file
  • Import my company's partner license key file
    • DO NOT SYNC when it asks
  • Jump in the VAR layer using our layer access code to make my changes real quick
  • Import back the Contoso license key file I exported from my job and SYNC

Disclaimer (ripped from MS)
All code used below is meant for illustration purposes only and is not intended for use in production. The following disclaimer applies to all code used in this blog:

THIS CODE IS MADE AVAILABLE AS IS, WITHOUT WARRANTY OF ANY KIND. THE ENTIRE RISK OF THE USE OR THE RESULTS FROM THE USE OF THIS CODE REMAINS WITH THE USER. USE AND REDISTRIBUTION OF THIS CODE, WITH OR WITHOUT MODIFICATION, IS HEREBY PERMITTED.


Here is said job, enjoy!

static void OutputLicenseToText(Args _args)
{
    #define.licenseVersion(2)
    #define.KeywordLen(20)
    #define.keywordLicense('License')
    #define.keywordProperties('Properties')
    #define.keywordCodes('Codes')
    #define.keywordCodeLine('CodeLine')
    #define.keywordDate('Date')
    #define.keywordSerial('Serial')
    #define.keywordValue('Value')
    #define.blank('')
    #define.space1(' ')
    #define.space2('  ')
    #define.space3('   ')
    #define.spaceHash(' #')
    #define.OutputFilename(@'C:\OutputLicenseKeys.txt')

    #define.keywordInfo(1)
    #define.keywordWarning(2)

    SysConfig           sysConfig;
    SysLicenseCodeSort  sysLicenseCodeSort;
    container           fileOut;
    int                 i;
    System.IO.StreamWriter  sw;
    InteropPermission perm = new InteropPermission(InteropKind::ClrInterop);
    ;

    fileOut += "LicenseVersion "    + strfmt("%1", #licenseVersion);
    fileOut += #blank;
    fileOut += #keywordLicense      + #spaceHash + xSysConfig::find(ConfigType::LicenseName,0).Value;
    fileOut += #blank;
    fileOut += #space1  + #keywordProperties;
    fileOut += #space2  + "Name"            + #spaceHash    + xSysConfig::find(ConfigType::LicenseName,0).Value;
    fileOut += #space2  + #keywordSerial    + #spaceHash    + xSysConfig::find(ConfigType::SerialNo,0).Value;
    fileOut += #space2  + #keywordDate      + #spaceHash    + xSysConfig::find(ConfigType::LicenseName,1).Value;
    fileOut += #space1  + "EndProperties";
    fileOut += #blank;
    fileOut += #space1  + #keywordCodes;

    // Build CodeLines
    while select sysConfig
        where sysConfig.configType  == ConfigType::AccessCodes   &&
              sysConfig.value       != #blank
        join sysLicenseCodeSort
        order by SortIdx
        where sysLicenseCodeSort.Id == sysConfig.id
    {
        fileOut += #space2  + #keywordCodeLine  + #spaceHash    + int2str(sysConfig.id + 1);
        fileOut += #space3  + #keywordValue     + #spaceHash    + sysConfig.value;
        fileOut += #space2  + "EndCodeLine";
        fileOut += #blank;
    }

    fileOut += #blank;
    fileOut += #space2  + "EndCodes";
    fileOut += #space1  + "EndLicense";

    // Begin file output
    perm.assert();

    sw = new System.IO.StreamWriter(#OutputFilename);

    for (i=1; i<=conLen(fileOut); i++)
    {
        sw.WriteLine(conPeek(fileOut, i));
    }

    sw.Flush();
    sw.Close();
    sw.Dispose();

    CodeAccessPermission::revertAssert();

    info("License successfully output to " + #OutputFilename);
}

Installing Microsoft Dynamics AX 2009 Refresh 4 on Windows 7/Vista

Microsoft has released yet another wonderful VPC that showcases the entire stack of Microsoft products working with Dynamics AX 2009 (it can be found HERE with PartnerSource/CustomerSource access).  Unfortunately...Microsoft says:

This virtual machine is configured for Windows 2008 Hyper-V and can’t be used with VPC (or Windows 7 Virtualization)
You can get Refresh 3.5 to work as a VPC...but I'm going to show you a neat way to get Refresh 4 running as a dual bootable Dev environment.

This how-to assumes you have downloaded Refresh 4, extracted it, and have AX5-W8R2-01.vhd and AX5-W8R2-01_DB.vhd available.

Start by right-clicking on Computer and clicking "Manage".

Go to Storage>Disk Management, then click the Action menu at the top and click "Attach VHD".  Attach both VHDs.  You will need AX5-W8R2-01_DB.vhd to be the F: drive, or once you mount it, you will need to copy its files and file structure over to the F: drive of your machine.  This is because the SQL instance looks for its database on the F: drive.
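
If you prefer the command line, attaching the VHDs can also be done with diskpart; the path below is just an example, so adjust it to wherever you extracted the files:
>diskpart
DISKPART> select vdisk file="C:\VHDs\AX5-W8R2-01_DB.vhd"
DISKPART> attach vdisk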

After you've attached these, all you need to do is create a boot entry to boot the VHD.  This can be done with the command-line tool BCDboot.  If the Windows partition of the VHD is now on the E: drive, for example, you just need to type the following from an elevated command prompt:
>bcdboot E:\windows
This will add a boot entry.  To see your work, just run "msconfig" and click the Boot tab.  There you can set the default OS you'd like to boot, the wait time (I changed mine to 5 seconds), etc.
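
You can also verify the new entry and give it a friendlier name with bcdedit from the same elevated prompt; {GUID-of-new-entry} is a placeholder for the identifier that bcdedit lists for your new entry:
>bcdedit /v
>bcdedit /set {GUID-of-new-entry} description "AX 2009 Refresh 4 VHD"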

All that's left to do is restart the machine and make sure you select the Server 2008 R2 instance.  Expect the following:

  • First boot will be slow because all of the drivers are trying to sync to your new hardware
    • This means your desktop might not show up for several minutes
  • Page file issues initially (not sure why these occur, but I just ignore them)
  • Some drivers you might have to manually download and install specific to your machine.  The only one I had to install was my graphics driver so that it would detect my two monitors.
This is nice because Refresh 4 requires a good amount of horsepower to run (recommended minimum of 4GB memory, dual CPU), and you can devote your entire machine to it.  Also, dual monitors, etc., feel nicer when it's not in a VPC window.

Happy DAX'ing!