
Microsoft Dynamics NAV Managed Services for Partners


Not a lot of time to blog about the next big announcement, made on the second day of Directions EMEA, because I’m actually still preparing my session a little bit.. . I have to, because the network connection is a problem, and I was kind of depending on it.. :(.

What is the big announcement:

Microsoft Dynamics NAV Managed Services for Partners

Now, what is it? Well, it is: tooling that enables you to not have to think about the cloud infrastructure anymore.

In short: “We build it, Microsoft runs it, we sell it”.

Or in my simple words: Microsoft is going to host your database in the cloud.

Or in yet other words: you don’t need to care about cloud infrastructure and installation/monitoring services anymore.

In my opinion: this solves one of the biggest hurdles we as partners had when deploying to the cloud.

Dmitry Chadayev showed what it looks like in a demo.

The portal is clearly built in NAV (something we are already very familiar with :-)). You get a clear overview of the SaaS solution that we (as partners) provide to our customers.

He showed how we can make our product available, and how Microsoft also makes available all the platform updates it ships .. . We are even able to do some (basic?) branding of our product, like changing the splash screen and such.

Dmitry created a new application, with the possibility of uploading template data and all that.

The things we upload are Azure SQL databases in bacpac format. It’s easy to convert stuff to bacpac, so that shouldn’t be a hurdle – might be interesting to create a simple script for it (I’ll look into that ;-)).
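
In the meantime, a minimal sketch of what such a script could look like – assuming SqlPackage.exe is available on the machine (it ships with SQL Server Data Tools / DacFx); the path, server and database names below are just example values:

# Export a database to a .bacpac file with SqlPackage.exe (DacFx)
$SqlPackage = 'C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\SqlPackage.exe'   # example location
& $SqlPackage /Action:Export `
    /SourceServerName:'localhost' `
    /SourceDatabaseName:'Demo Database NAV (9-0)' `
    /TargetFile:'C:\Temp\DemoDatabaseNAV90.bacpac'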

Too much to blog about all the details Dmitry showed .. But it really looks good, slick, easy, .. Simple! What we expect from Microsoft, right?

Marko mentions that there are already customers running live on this service, like quite some customers from GAC (the Netherlands).

Provisioning tenants was almost childishly easy. It was just clicking “New”, defining a few fields like Name and Country .. . Next, adding users (obviously) AND THAT’S IT! A new tenant gets created with the template data from the bacpac we provided while setting up the application .. . Wow!

This is going to kick ass. Great stuff.

“The management portal is super easy to use. Our deployments are now handled by the sales team, without the need for a technical resource. We can get customers up and running with a login email in 6 minutes from when they sign up”

A quote from a partner that is already running 50 live customers on this service (I might have misunderstood that number, though). Well, I can’t see our sales department doing this, but it is easy, super easy, I really agree!

Next, Dmitry showed how an upgrade is managed .. simply by moving tenants to your new (upgraded) application service. That’s it! Obviously, you need to have everything in place (upgrade toolkit, upgrade codeunits)! The service even manages the extensions! There are even services available to debug and even dive into SQL diagnostics!

Backups are also possible, including downloading backups of tenants .. .

There was a lot more – too much to mention.

OK, I have a problem now .. My mouth is wide open and now I need to find a way to close it again. Wow! Jaw-dropping!

Next, something about licensing .. A big announcement was that Microsoft will allow 1 license per application. We will be able to add and remove users, and a lot more. So customers don’t have to wait to add users, and we save all the time it used to take to add or change the license. Good feedback from partners, and Microsoft listened! :-). Next, we will only get 1 invoice per month, for everything, with all details available.

Now, what about pricing? Well, I’m not going to put that on a blog – don’t think that would be wise, but I’m sure you all have your channels to get pricing information asap!

A few things worth mentioning about the partner profile ..:

  • Cloud – the services are only for cloud business, not on premise
  • Multitenant – the solution needs to be multi-tenant!
  • Repeatable – and repeatable. You will need to have a number of customers on there!
  • Scale

So, your business model needs to be repeatable, volume-oriented, low cost of sale and annuity (whatever that means..)!

Now, I know that there are a few limitations still – they were not announced, so I will only mention them in later blogposts from the moment they get confirmed. But all this is a Microsoft service! Which basically means we really can expect this to grow .. and limitations at this point will really soon become new features in the future ;-).

To conclude, Marko already announced the roadmap of the next version of Microsoft Dynamics NAV, codename “Madeira”. I don’t know whether it’s wise to already mention too much of it, but there is definitely a focus on an even tighter integration with Office 365, where we could see NAV customer data in an email from that customer – talk about integration! The same integration exists for vendors, and we can even create purchase orders starting from an email conversation with that vendor – or create new contacts if NAV doesn’t find any contacts for an email.. .

That’s it from your reporter at Directions EMEA ;-).. again, this blogpost is “as is” .. including typos and misinterpretations! Nevertheless .. I’m excited! :-)


Microsoft Dynamics NAV 2016: What’s New


With the new release of Microsoft Dynamics NAV 2016, there is a lot to blog about. Let me start by trying to list all the new features in this new release. Though, I’m actually quite sure I’m not going to be able to, because there are just that many .. and as Vjeko mentioned in his recent blog post, there are many things that were not announced explicitly, but are also in this new release. OK, here it goes …

For Consultants

There are quite some functional additions to the product. Now, I didn’t follow these sessions at Directions EMEA, and I’m not a functional consultant at all (focusing on technical parts is already hard enough ;-)), but let me try to list a few application additions to the product:

Power BI

Finally, Power BI supports Microsoft Dynamics NAV .. . This means that when you go to powerbi.com, you have a new service available, called “Microsoft Dynamics NAV”, like you see here:

This would imply that we finally have a decent integration, including data refresh! As far as I know, this is crucial, but was not possible until now!

Download Exchange Rates

Exactly what it says – the application is now able to download exchange rates and add them to the Exchange Rate table. Can’t imagine this hasn’t been built by many partners – we had our own solution for this for years!

OCR Service (Lexmark)

Another service Microsoft Dynamics NAV 2016 uses is an OCR service from Lexmark, integrated in the “Incoming Documents” functionality. As such, you’re able to send in a scanned (or photographed) purchase document, and the OCR will try to extract valuable data from the picture.

Integration with CRM

You probably all have experimented with the old CRM Connector, and came to the conclusion: it sucked! No flexibility, not stable, … We tried to implement it a few times, but always had to turn to third-party solutions, which are hard to sell to customers that buy 2 Dynamics products.. .

Now, with NAV 2016, there is finally a decent way to connect NAV with CRM, by adding a new type of table: a CRM table. This way, we are able to connect to a CRM Online table, which means we can code against it like it was a normal table in NAV. So the entire CRM integration is actually just coded in C/AL. Which means: readable, extendable! In 10 minutes, you can have an out-of-the-box connection with CRM Online. And a few minutes later, you can even have a customized CRM Connector that matches your customer’s needs.

Improved Web Client

The Web Client has been improved on more than 60 points. To mention only a few:

  • Tabs are extractable
  • You can switch companies
  • You can switch languages

Universal App

We all knew the tablet app, and loved it, right! Well, now there is a “Universal App”, which means: one app for all devices. It serves as a tablet app (in tablet mode), and on a phone it acts as a phone app .. Yes indeed! A PHONE APP! Which looks something like this:

Workflow

One of the biggest new things in NAV! Based on Eventing (see later in this post), you are able to create your own workflows, configurable as you like. You’re able to create your own workflow events (either in code, using the workflow framework, or using pre-configured workflows) with conditions and responses (like mail, a notification or whatever).

I have been blogging about it here, but that’s just a small post about a big feature. I strongly recommend diving into it. It’s a great framework, and you can apply lots of things to it!

The only remark might be that you’ll probably have to add code to make it work like you want. But that’s OK. It should be treated as a framework instead of a complete solution anyway (in my opinion)!

For Developers

There are also quite some changes and additions on platform and development level. Let’s try to go over them:

Application Test Toolset

I’m very excited about this addition. The Test Toolset has been out there for years – but now it’s supported, extended and released for all supported countries. The framework has been extended, and out-of-the-box, there are about 15000 tests available!

Ability to use Camera and GPS in C/AL

It is even possible to code against the camera and GPS in C/AL! Just look at the how-to’s in the help:

  • How to: Implement the Camera in C/AL
  • How to: Implement Location in C/AL

Eventing

This is a topic that would take a few days to explain. Eventing is something very common in the .Net world, and now, with Eventing in NAV, we can benefit from it as well. Eventing consists of “publishers” and “subscribers”. A publisher raises an event in code when something happens, like “OnBeforePostDocument” in codeunit 80. A subscriber is a function that subscribes to an event: we create a function, set its “Subscriber” property, and assign it to a specific published event. This means that when the published event gets executed, all the “attached” subscriber functions get executed as well.

Customization Extensions

Also something that is difficult to explain in a few words. A Customization Extension can be seen as an enhanced fob file which only contains deltas. This is a very rudimentary description, but really .. if you hold yourself to a few limitations (which will be blogged about a lot), you are able to create a so-called “navx” file. This navx file contains information/delta files/permission sets/.. , and can be published and installed in other databases.

It can even be installed on specific tenants, which makes it possible to do specific customizations for specific tenants in a multitenant environment.

Again, this is just a short description, but actually it’s a huge thing! I had a session on best practices for these extensions at Directions EMEA, and will be doing the same session at Directions US. A lot more to come about this topic!

The new Code Editor

Great for developers: finally we have code completion and IntelliSense in the code editor within C/SIDE – already in the year 2016! :-). I won’t go too deep into it. Vjeko has explained this extensively on his blog.

Now, if you would – for some weird reason – prefer the old code editor, there is still a way. Just start the finsql.exe with the option “useoldeditor:yes”, like:
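
The screenshot isn’t reproduced here, but it comes down to something like this (the path is simply the default NAV 2016 client install folder – adjust as needed):

& "C:\Program Files (x86)\Microsoft Dynamics NAV\90\RoleTailored Client\finsql.exe" useoldeditor:yes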

And then the new shortcuts which everyone loves:

  • CTRL+G opens the “Globals” window
  • CTRL+L opens the (guess what ..) “locals” window
  • CTRL+Z is undo (and you can do multiple :-))
  • CTRL+Y is redo

A non-NAV developer is not amazed, I know, and he shouldn’t be – but we are, aren’t we :-).

Posting Preview

Another finance addition. In a way, we’re able to see what a posting will result in, before actually posting. The result is a kind of “navigate” window with the different entries that the posting will create. Like:

You can drill down to each entry.

TryFunction

A try-catch for NAV! Yes, really! One thing to notice, though .. when a TryFunction fails, the transaction is NOT rolled back! This is something to keep in mind! This is by design: you catch the error, not the transaction!

Microsoft Dynamics NAV Managed Service for Partners

THE big secret. Microsoft is taking the infrastructure challenge of setting up multitenant cloud services based on NAV out of our hands. This was a big hurdle, so this service is a big help for partners. I truly believe that. I haven’t got any details yet on how to start using it and such, or any screenshots, but I’ll try to figure it out.

A few remarks on this one: it is only intended for multitenant solutions, and you HAVE to upgrade to a new Cumulative Update at least every three months!

A few small additions to NAV 2016, as mentioned by my colleague on his blog (more info to be found there):

Multiple Namespaces in XMLPorts

The ability to add multiple namespaces

Automatic Deployment of Add-Ins

We can now add an add-in as a zip file to the Add-in table, and NAV will make sure it gets deployed to the clients when necessary.

64 bit client

Which will result in faster processing for memory-intensive operations, like (for example) reports.

Timestamps

Like an AutoIncrement field, a BigInteger field with “Timestamp=Yes” will maintain itself .. on each modify, a new timestamp value will be written to the record. This is great for interfacing, to find out which records have changed since a specific timestamp.

FOREACH

Only for .Net objects, but very useful! Used to loop over .Net collections!

Record.ISTEMPORARY

Really good addition: being able to test whether a certain variable is temporary or not.. . I think we were only able to do this on a RecRef – well, now we can do it on a Record variable as well! Thanks, Soren, for getting this added!

Record.RECORDID

Same for the “RECORDID” property – we do not have to use the RecordRef anymore to get to the RECORDID.

Some known issues

As with any major release, it comes with a few pitfalls. I want to be thorough, so I don’t want to keep things from you. Here are the known issues at this moment:

  • Installing NAV2016 will overwrite the 2015 administration tool. Arend Jan blogged about that here: http://kauffmann.nl/index.php/2015/10/07/nav-2016-installation-overwrites-nav-2015-administration-tool/ . He also mentioned how to fix it!
  • Apparently, the GetDatabaseTableTriggerSetup function is executed every time a table is entered, instead of only once per table. This has been mentioned on Twitter, but I have not had the time to confirm it yet. This might result in some performance issues!
  • In the Customization Extensions there are a few issues that will be fixed in the next CU, which are causing the data upgrade to fail:
    • Ability to work with AutoIncrement
    • Ability to work with FlowFields

As far as I know, all issues that were reported will be fixed in CU1. But that’s not a promise ;-), that’s an assumption (which is the mother of all….).

OK, guys and girls! Hope you liked it, hope it’s somewhat complete. Probably not. If not, please drop me a comment!

Fix NAV2015 Administration Shell after NAV 2016 installation


You should read Arend-Jan’s blogpost first.

He explains that after you install NAV 2016 on a machine that already has NAV 2015 on it, it breaks the NAV 2015 administration console. Arend-Jan also provides a fix for this! A really easy one, even.

But he was also talking about “if you’re not a PowerShell guru” .. and such. Now, I’m not a guru, but I sure like to “fix” things purely in PowerShell. So – since I needed the fix myself – and since I’ll probably have to apply this fix numerous times – I was wondering whether I could come up with a simple “run with PowerShell” fix for this issue.

And I believe I have a second solution, which executes all the steps that Arend-Jan already described .. but all within PowerShell. Just copy/paste the script below into PowerShell ISE, execute it, and you should be good to go. On top of that, you might be able to include this script in any provisioning script that you might already have .. . Don’t know – it’s up to you, and the script is “as is” ;-).

Thanks Arend-Jan for sorting this out ;-).

# Open the registry key of the NAV 2016 (9.0) administration snap-in (the one the NAV 2016 setup registered)
$BaseReg90 = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey('LocalMachine', [net.dns]::GetHostName())
$RegKey90  = $BaseReg90.OpenSubKey('SOFTWARE\Microsoft\MMC\SnapIns\FX:{BA484C42-ED9A-4bc1-925F-23E64E686FCE}')

# (Re)create a separate registry key for the NAV 2015 (8.0) administration snap-in
$BaseReg80 = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey('LocalMachine', [net.dns]::GetHostName())
$RegKey80 = $basereg80.CreateSubKey('SOFTWARE\Microsoft\MMC\SnapIns\FX:{BA484C41-ED9A-4bc1-925F-23E64E686FCE}')

foreach($RegKeyValue90 in $RegKey90.GetValueNames()){  
    $value80 = $RegKey90.GetValue($RegKeyValue90)

    if ($value80 -match '\\90\\'){
        $value80 = $value80 -replace '\\90\\','\80\'
    }
    if ($value80 -match 'ManagementUI, Version=9.0.0.0'){
        $value80 = $value80 -replace 'ManagementUI, Version=9.0.0.0','ManagementUI, Version=8.0.0.0'
    }
    if ($value80 -match 'BA484C42-ED9A-4bc1-925F-23E64E686FCE'){
        $value80 = $value80 -replace 'BA484C42-ED9A-4bc1-925F-23E64E686FCE','BA484C41-ED9A-4bc1-925F-23E64E686FCE'
    }

    $regkey80.SetValue($RegKeyValue90, $value80)
}

foreach($RegKeySubKey90 in $RegKey90.GetSubKeyNames()){
    write-host $RegKeySubKey90

    $RegKey80.CreateSubKey($RegKeySubKey90)
}

$msc80file = "${env:ProgramFiles(x86)}\Microsoft Dynamics NAV\80\RoleTailored Client\Microsoft Dynamics NAV Server.msc"
(get-content $msc80file).Replace('ba484c42-ed9a-4bc1-925f-23e64e686fce','ba484c41-ed9a-4bc1-925f-23e64e686fce') | Set-Content $msc80file

 

NAV 2016 Eventing: All published Integration and Business events


It’s been a while since I’ve put up a decent blogpost. Well .. I’ve been busy at conferences, to be honest, doing lots of sessions. And I’m always nervous about these – I take it seriously, you know – people pay a lot of money to come to conferences like that, and the sessions need to be good. So .. I do my best .. and hope to succeed ;-)..

Best Practices for NAV Extensions

One of my sessions was about Best Practices on the exciting new feature in NAV 2016: NAV Extensions. You will read all about it in the near future, when I can find time to blog about the content.

One of the tips that I showed was how to find all Integration and Business events in the product. Well .. you know me .. I created a PowerShell script for that :-). People have been contacting me for this script, so here you are:

$DistriText = [System.IO.File]::OpenText('C:\Temp\AllObjects.txt')

$ResultArray = @()
$CurrentObject = ''
$i = 0

for(;;) {
    $TextLine = $DistriText.ReadLine()
    
    if ($TextLine -eq $null) { break }

    $i++

    switch ($true)
    {
        {$Textline.Contains("OBJECT ")} {$CurrentObject = $TextLine.TrimStart()}
        {$Textline.Contains("  [Integration")} {$CurrentEventType = $TextLine.TrimStart()}
        {$Textline.Contains("  [Business")} {$CurrentEventType = $TextLine.TrimStart()}
        {$Textline.Contains(" PROCEDURE") -and !([String]::IsNullOrEmpty($CurrentEventType))} {
            $CurrentFunction = $TextLine.TrimStart()

            $MyObject = New-Object System.Object
            $MyObject | Add-Member -MemberType NoteProperty -Name Object -Value $CurrentObject
            $MyObject | Add-Member -MemberType NoteProperty -Name EventType -Value $CurrentEventType
            $MyObject | Add-Member -MemberType NoteProperty -Name Function -Value $CurrentFunction            
            $ResultArray += $MyObject

            $CurrentEventType = ''
        }             

        Default {}
    }

    Write-Progress -Activity "Reading through objects.." -Status $CurrentObject

}

$DistriText.Close()
$DistriText.Dispose()

$ResultArray | ogv

 

The concept is simple: you just feed it a complete text file of all objects of a certain database, and it will give you a list of events in a grid view.
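
If you don’t have such a text file yet, the development shell can create one for you – a minimal sketch, where the server, database and output path are just example values (Export-NAVApplicationObject is part of the NAV 2016 development cmdlets):

# Load the NAV 2016 development cmdlets and export all objects to one text file
Import-Module 'C:\Program Files (x86)\Microsoft Dynamics NAV\90\RoleTailored Client\Microsoft.Dynamics.Nav.Model.Tools.psd1'
Export-NAVApplicationObject -DatabaseServer 'localhost' -DatabaseName 'Demo Database NAV (9-0)' -Path 'C:\Temp\AllObjects.txt'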

If you want the official list from Microsoft on MSDN, you can find it here (thanks to Joe Mathis for sending me this link). I don’t know how well it is being maintained, but hey .. now you at least have two resources ;-).

Enjoy!

Download Waldo’s PowerShell Modules


Yesterday, I did my session at NAVTechDays, being “Thinking out of the box with NAV Development”. Now, I will be blogging about this great conference and about the session, but not just yet. In this post, I would like to shortly focus on one thing that I addressed during that session.

My Powershell Modules

The past two years, I have been doing a LOT of PowerShell in my free time: while waiting for airplanes, IN the airplanes, in hotels, … even during sleepless nights (which are usually the most productive ;-)). This has resulted in a set of functions that actually make my daily life much easier, like:

  • Creating an ISO from a Cumulative Update
  • Installing NAV
  • Copying a default instance to another test instance
  • Converting it to multitenancy
  • Merging
  • Data upgrades
  • Taking backups, restoring and destroying databases

Too much to address.

Now, starting from yesterday, this PowerShell function set is available for you to download on GitHub. GitHub is a great platform for it, recommended by Kamil :-). I can post updates, and you can track what I update, even compare on code level, to make up your mind whether you want the new version or not ;-). It also has functionality to report issues .. an opportunity for you to report any bugs to me (there will be bugs… I promise .. so please do report them).

Why

The reason why I do this is simple. First of all, I’m a community guy (duh .. ;-)) .. but I also did quite a lot of workshops and sessions the past two years, and the main impression that I got is that PowerShell isn’t very well adopted by the NAV community just yet. While it really can make your life so much easier on a daily basis.

Now, I admit it only started to get really, really interesting for me when I started to put this function set together … so I truly believe that with this set, you will be able to start much quicker with PowerShell (if you haven’t already..).

Disclaimer

It’s out there .. download it, use it, get ideas from it, try it out, use it in your daily life! Though, one big disclaimer: the download is “as is”, without any support. I can promise two things: it will make your life easier, but also: there will be bugs! Please report them on GitHub, so I can have a look – but there are no promises. This is still a side project for after working hours, after all ;-).

PowerShell workshop ahead


It’s not one of my habits to do advertisement on my blog. Even more: I hate advertisements on any blog – so especially on mine. And I don’t get any money doing this – so no benefit here – but there is value…

There is one reason why I did want to mention this upcoming PowerShell workshop on the 1st and 8th of December.

The reason is simple: the past two years I have been evangelizing PowerShell in the Dynamics NAV community. And either I’m not good at evangelizing anything, or I just didn’t reach enough people. In any case, I still notice a tremendous lack of both PowerShell knowledge and PowerShell usage in the NAV community. So that’s why I would like to bring this workshop to your attention.

The workshop is given by me in Veenendaal (Holland) in two parts: one day on the 1st of December, and the next day one week later, on the 8th. This is done so people can familiarize themselves with PowerShell for a week in between.

It’s a workshop that fellow-MVP Luc Van Vugt organizes with Areopa, and you can find more information about it here.

If you want to register, just send them a mail.

C U there!

NAVTechDays 2015: Final thoughts


Time for the traditional wrap-up of NAVTechDays. This has become a tradition, so let’s not break it and share some thoughts about what I have been calling “the best NAV conference there is” ;-). Well .. is it still?

As every year, NAVTechDays was held in one of the biggest movie theaters in Belgium: the Metropolis (from Kinepolis) in the “beautiful” city of Antwerp. Yes, I do use quotes, because for me, it’s just the city closest to me – nothing really exciting anymore ;-). But for many people, it’s the city of diamonds, or beer, or chocolate, or .. the city where they throw hands (I admit, that sounds somewhat less appealing .. ).

Great tradition, because let’s be honest – for all the years of NAVTechDays – the location was one of the main factors which made NAVTechDays typically “NAVTechDays”:

  • The best speaker infrastructure
  • The best seats ever
  • The best food
  • The best .. Uhm .. everything!

So why change a winning team.

I wouldn’t want to be in Luc’s shoes. You know – Mr. MiBuSo. He just has such big expectations to live up to. But I guess it’s because of these expectations – this series of perfections – that NAVTechDays is a growing success. I mean: as far as I understand, Antwerp is a disaster to travel to (the worst traffic ever, far from the airport, …), but still the number of attendees is growing e-ve-ry year! This year: about 950 attendees from 42 different countries … THAT’S ALMOST 1000. I’m sure we’ll hit it next year ;-).

Pre-Conference Day

For the third year, there was a pre-conference day. This is a training opportunity for the attendees that want to spend an extra day and learn “from the best” (I need to put that between quotes, just because I was one of the people giving a workshop ;-)). This year, 180 people attended one of these workshops:

Power BI for Dynamics NAV – Steven Renders
Working with PowerShell and NAV – waldo
Document Reporting in NAV 2015 – Claus Lundstrom
Troubleshooting Essentials for SQL Server and Dynamics NAV – Jörg Stryk
Let it build! – Kamil Sacek
NAV ALM using TFS – Luc Van Vugt
Implementing the Role Tailored client with Success – Peik Bech-Andersen
Installing and configuring SQL Server for Dynamics NAV – Alain Krikilion
Developing control add-ins for Dynamics NAV – Vjekoslav Babic
NAV on Azure & in Office 365 – Arend-Jan Kauffmann
An introduction to Scrum – Vincent Bellefroid
Generic, Repeatable & Low Footprint coding techniques – Soren Klemmensen & Gunnar Gestsson
Microsoft Dynamics NAV Application Architecture and Design Patterns – Mark Brummel

As you see, I had my own little workshop about PowerShell (duh, what else ;-)), and this year, I had the penthouse:

A meeting room with its own kitchen, restrooms, … everything .. and .. with this view:

Conclusion: PowerShell rewards you in many ways ;-).

Unfortunately, I had the impression again that doing PowerShell for one day just isn’t enough. PowerShell needs a lot of exercise, lots of practice .. like any other new language you come across. So it’s just very difficult to get really efficient in one day. Now, good news – Luc is thinking about having 2 pre-conference days next year! That would be great news for PowerShell!

The conference

The next two days, it was time for the conference. And as every year, it’s quite a “different” event from all the others .. in many ways. Just because it’s in a movie theatre, the infrastructure in all aspects is perfect: the biggest screen ever (I mean: look at this picture)

.. the best seats, presenter infrastructure like cloned screens, a switch for multiple laptops, … and recordings! All sessions are recorded and made available on Mibuso here. Only at NAVTechDays ;-). Everyone can download and benefit from all the sessions that have taken place this year .. more than 18 hours of video!

The nice thing at NAVTechDays is that Luc tries to invite knowledgeable people. And really – maybe except for myself – he really succeeds in that. People from the field, lots of MVPs, shared their experiences in their main areas of expertise. Personally, I very much liked the Web Services session from Arend-Jan and Mark. AJ had the best examples to illustrate web services. Really useful .. do check out the presentations on Mibuso!

One downside at this year’s NAVTechDays was the WiFi. I think we can all agree: there hardly was any WiFi – or even wired internet connection, for that matter – in the rooms .. and even during some sessions. This makes sessions somewhat challenging – especially when they’re using (or are about) the cloud. But then again, speakers should prepare for this (screenshots on slides, for example), as this is not a very uncommon thing to happen at a conference with 1000 attendees. Luc did all he could to provide the best infrastructure – a dedicated speaker network, even a dedicated wired network. But sometimes Murphy isn’t playing very fair, I guess..

My Session: Thinking Outside the box with NAV Development

Luckily, during my session I didn’t have one problem with the internet. Then again, I really didn’t use it that much. My session was about “Thinking outside the box”. I came up with the idea when I was watching “Bear Grylls” on Discovery Channel, as he mentioned that he needs to think outside the box when surviving in any remote environment .. . Luc loved the idea and gave it a shot.

I must say, I was very surprised by the amount of people. They kept pouring in – I had a room with a capacity of 750 people, and I heard that many couldn’t find a seat, and had to sit on the stairs and stand in the hallway.

What was it about? Well, the topic could be about anything, right .. so I decided to talk about:

  • Using PowerShell in daily life
  • Do’s and don’ts about NAV Extensions
  • Some hacking abilities with NAV (to point out a big security risk when you configure NAV in the wrong way)
  • A small tip on how you can generate online help from a Word document – useful for CfMD
  • NAVMgt: an internal tool at my company that we use to manage all our installations: internal and external

The only comment I got was “there was a lot of PowerShell”. Well, I guess that’s true .. and somewhat unwillingly, but not surprisingly… There is no way around it, guys. PowerShell is something you’ll need to look into, because a lot of problems can be solved with it .. it makes your life so much easier if you let it. So when talking about best practices, it’s normal that PowerShell pops up regularly.. . Anyway .. that was just in my defense ;-).

During this session I “released” my PowerShell scripts, which I’ve put on GitHub. I already blogged about it here. Use them – really. This week, for example, I have been doing 6 upgrades with my company (including data) .. and I must admit, the scripts work! During the next few blogposts, I will guide you through the functions that are in the scripts .. a lot to come ;-).

The recording of my session can be downloaded here. And Luc has made it available on the MiBuSo channel on Youtube as well:

Hope you enjoy ;-).

If I get the chance to do a similar session next year, I’ll surely look into the PowerShell topic, and make sure there is a better balance between PowerShell and “something else” ;-). But I do want to thank Luc for this opportunity. It’s a lot of work, but it’s so rewarding to do sessions at NAVTechDays. The best audience ever! :-)

Some impressions

So, I think we can speak of another successful NAVTechDays! And yes – it’s still the best NAV conference around, in my opinion! Can’t wait for next year, to be honest. Here are some impressions of the conference. First of all, the traditional video by fellow MVP Kai Kowalewski (hope I have his name right).

And of course some of the pictures of the event …

The speakers (don’t know what Gary was doing though.. :-))

A LOT of MVP’s (there are a few missing that also attended NAVTechDays though):

Me during my session .. NOT talking about myself

Me with a small part of my team :-):

Wall-e and Eve:

Daniel enjoying a “bolleke” from Antwerp .. and even taking pictures of it:

42 Countries, 953 attendees!

The people in my session …

And the unavoidable PowerShell:

Arend-Jan doing his Web Services thing:

My best picture of the session room – look how gigantic it is!

Speaker’s shirts:

How to install Waldo’s PowerShell Modules


A while ago, I announced the “release” of my PowerShell scripts and functions here. It’s a big thing for me, as lots and lots of effort and time have been put into it (and still are), but more important .. I’m convinced it can save you lots of time.

Now, I only announced it, and for people that are not that familiar with PowerShell, it’s not that obvious how to get going. Let me try to introduce you to the modules, and how to make them available on your system.

To start with, you can download the package from my GitHub here: https://github.com/waldo1001/Cloud.Ready.Software.PowerShell

You can either download the zip, or fork it using GitHub (if you have an account, of course). Downloading the zip is obviously the easiest, as it doesn’t require any GitHub knowledge or installation. But if you fork it, you can contribute to it, and we can be one big happy PowerShell family.
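
If you prefer Git over downloading the zip, cloning works just as well – a quick sketch (the target folder is just an example):

# Clone the repository (target folder is just an example)
git clone https://github.com/waldo1001/Cloud.Ready.Software.PowerShell.git C:\waldo\Cloud.Ready.Software.PowerShell
# later, from within that folder, get the latest updates
git pull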

Modules

I make a distinction between functions and scripts. When you look at the master folder, you’ll see a bunch of numbered folders – these are my scripts! And there is one folder “PSModules” – those are my functions, which are categorized into 4 modules:

  • Cloud.Ready.Software.NAV: this module contains functions that have anything to do with NAV. It needs no explanation that this module contains most of the functions.
  • Cloud.Ready.Software.PowerShell: contains typical PowerShell functions. At this moment, only one function is in there: one to manage confirms (yes/no questions).
  • Cloud.Ready.Software.SQL: contains functions that have anything to do with SQL, like backups, restores, invoking any query .. things like that.
  • Cloud.Ready.Software.Windows: has the typical Windows-related functions, like zip and unzip, working with streams, files, ISOs, .. .

So, a lot of code for you to consume. Lots of the NAV functions are dependent on the other modules, so I advise you to always install all 4 modules!

How do I install

Well, you can copy and extract the files to virtually any place on your system. I even put them in my Dropbox, so that I can make them easily available on all my systems – it’s just a matter of enabling that system with my Dropbox and I’m done – all scripts are up to date :-).

When copied and extracted, you need to browse to the PSModules folder, where you will find the installation scripts.

  1. InstallModule.ps1

    This is obviously THE installation file to make the modules available to the system. It’s going to update your PSModulePath, and make sure PowerShell will always find these modules. On top of that, it’s going to load the modules, so you don’t need to do that anymore ;-). Easy peasy! After this, you’re good to go; nothing else is needed to work with the modules (a rough sketch of what such an install script typically does follows after this list). The following 2 scripts are just for your convenience.

  2. CreateProfileInISE.ps1

    This one is only useful in PowerShell ISE! The script is going to add some menu buttons to your ISE (scripting environment). When you run this, it’s going to add some code to your profile-script, and you’ll end up with this:

    It’s just there because I like to use it this way: easy to load the NAV commandlets, and other things I like to have behind a button click. If you don’t want it, just don’t run the script!

  3. LoadModules.ps1

    This is the last script in this folder, and it is only useful in one particular situation: when you notice PowerShell isn’t running the latest version of a function you changed (assuming you changed one of my functions..), just run this script, and it will force the modules to reload. When you did step 2, this function is behind the last menu item.
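
As for InstallModule.ps1: a rough sketch of what an install script like that typically does – an assumption for illustration, not the actual content of the file – is to add the PSModules folder to your PSModulePath and import the modules:

# Illustration only – roughly what a module install script tends to do
$ModulePath = 'C:\waldo\Cloud.Ready.Software.PowerShell\PSModules'   # example location
$UserPath = [Environment]::GetEnvironmentVariable('PSModulePath', 'User')
if (-not $UserPath) { $UserPath = '' }
if ($UserPath -notlike "*$ModulePath*") {
    [Environment]::SetEnvironmentVariable('PSModulePath', ($UserPath.TrimEnd(';') + ';' + $ModulePath).TrimStart(';'), 'User')
}
# Import every module folder so the functions are available in the current session
Get-ChildItem -Path $ModulePath -Directory | ForEach-Object { Import-Module $_.FullName -Force }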

You’re good to go!

That’s it! You’re good to go, and use the scripts AND functions whenever you like to! It’s indeed just a matter of running one script (the first one) and you’re good to go.

If you want to have an overview of all the functions that are in the modules, then just run this commandlet:

Get-Command -Module 'Cloud.Ready.Software.*'

Tracking updates on GitHub

Very likely, more functions are coming, and functions or scripts will get updated. If you’re interested in these updates, just monitor my commits on GitHub :-). With each commit, you can see what has been changed, and even look at the file to figure out whether it’s interesting for you or not ;-).

Enjoy!


Print any document (any extension)


When reading Mark’s blog about printing PDFs, it reminded me of a request I got not so long ago, which was quite similar: “I want to be able to easily print any document that is linked in my system, from NAV, … ”.

But you know the challenge: if you want to print a PDF, you have to have some kind of PDF software (Bullzip, Adobe, whatever…). If you want to print a Word document, you need Word, or some kind of dll or software that can print that extension.

So I was looking for a generic solution, without any custom dlls, that always works, whenever, however, with any software – as long as you have “any” software installed that can print that “any” document with any extension .. . And otherwise, it should just give an error.. .

Now, you must have noticed .. if you right-click on a pdf, you get a menu, like this:


This means that the operating system knows it can print the document.

So – in a way – if the operating system knows, NAV should be able to know as well .. .

Now, it didn’t take me long to figure out. It’s actually quite easy in the .NET Framework. In fact, it seems that when you start a process, you can add “verbs” to the process to specify what you want to do .. . One of these verbs is “Print” :-). So it’s just a matter of starting a process for this file with the verb “print”. And if the process starter finds an application that is associated with this extension, it will use that program to print :-).

Here is the code:


And the vars that I used:

Name – DataType – Subtype
ProcessStartInfo – DotNet – System.Diagnostics.ProcessStartInfo.'System, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'
Process – DotNet – System.Diagnostics.Process.'System, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'
ProcessWindowStyle – DotNet – System.Diagnostics.ProcessWindowStyle.'System, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'
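
My C/AL code was shown as a screenshot, so it isn’t reproduced here – but as a rough illustration of the same idea, this is what the approach looks like in PowerShell, using the same .NET types (the file path is just an example):

# Print a file with whatever application Windows has associated with its extension,
# by starting a process with the "print" verb
$ProcessStartInfo = New-Object System.Diagnostics.ProcessStartInfo
$ProcessStartInfo.FileName = 'c:\Temp\Any.pdf'
$ProcessStartInfo.Verb = 'print'
$ProcessStartInfo.WindowStyle = [System.Diagnostics.ProcessWindowStyle]::Hidden
[System.Diagnostics.Process]::Start($ProcessStartInfo) | Out-Null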

Needless to say, I can simply call it like this:

PrintAnyDocument('c:\Temp\Any.pdf');

And with the TryFunction (which I can safely use here because it’s a pure .Net try/catch), it gives me some nice error catching, like:


Enjoy!

A Merry Christmas and a “Cloudy” New Year!



I would like to wish you a Merry Christmas and all the best for next year! May the force be with you in your way to the cloud. :-)

And waldo wouldn’t be waldo if he didn’t celebrate it with PowerShell .. here is a script I found on PowerShell.com :-) .. Just paste it into PowerShell ISE and run it!

# inspired by: 
# http://powershell.com/cs/blogs/tips/archive/2015/12/14/time-for-christmas.aspx 
$notes = @'
  4A4 4A4 2A4 4A4 4A4 2A4 4A4 4C4 4F3 4G3 1A4 
  4Bb4 4Bb4 4Bb4 8Bb4 8Bb4 4Bb4 4A4 4A4 8A4 8A4 4A4 4G3 4G3 4A4 2G3 2C4 
  4A4 4A4 2A4 4A4 4A4 2A4 4A4 4C4 4F3 4G3 1A4 4Bb4 4Bb4 4Bb4 4Bb4 
  4Bb4 4A4 4A4 8A4 8A4 4C4 4C4 4Bb4 4G3 1F3 
  4C3 4A4 4G3 4F3 2C3 4C3 8C3 8C3  
  4C3 4A4 4G3 4F3 1D3 4D3 4Bb4 4A4 4G3 1E3 4C4 4C4 4Bb4 4G3 
  1A4 4C3 4A4 4G3 4F3 1C3 4C3 4A4 4G3 4F3 1D3 
  4D3 4Bb3 4A4 4G3 4C4 4C4 4C4 8C4 8C4 4D4 4C4 4Bb4 4G3 4F3 2C4 4A4 4A4 2A4 
  4A4 4A4 2A4 4A4 4C4 4C3 8G3 1A4 4Bb4 4Bb4 4Bb4 8Bb4 4Bb4 4A4 4A4 8A4 8A4 
  4A4 4G3 4G3 4A4 2G3 2C4 4A4 4A4 2A4 4A4 4A4 2A4 4A4 4C4 4F3 8G3 
  1A4 4Bb4 4Bb4 4Bb4 4Bb4 4Bb4 4A4 4A4 8A4 8A4 4C4 4C4 4Bb4 4G3 1F3'@ 
# Note is given by fn=f0 * (a)^n 
# a is the twelth root of 2 
# n is the number of half steps from f0, positive or negative 
# f0 used here is A4 at 440 Hz 
$StandardDuration = 1000
$f0 = 440
$a = [math]::pow(2,(1/12)) # Twelth root of 2
function Get-Frequency([string]$note)
{
  # n is the number of half steps from the fixed note
  $null = $note -match '([A-G#]{1,2})(\d+)'
  $octave = ([int] $matches[2]) - 4;
  $n = $octave * 12 + ( Get-HalfSteps $matches[1] );
  $f0 * [math]::Pow($a, $n);
}
function Get-HalfSteps([string]$note)
{
  switch($note)
  {'A'  { 0 }'A#' { 1 }'Bb' { 1 }'B'  { 2 }'C'  { 3 }'C#' { 4 }'Db' { 4 }'D'  { 5 }'D#' { 6 }'Eb' { 6 }'E'  { 7 }'F'  { 8 }'F#' { 9 }'Gb' { 9 }'G'  { 10 }'G#' { 11 }'Ab' { 11 }
  }
}
$notes.Split(' ') | ForEach-Object {
  if ($_ -match '(\d)(.+)')
  {
    $duration = $StandardDuration / ([int]$matches[1])
    $playNote = $matches[2]
    $freq = Get-Frequency $playNote
    [console]::Beep( $freq, $duration)
    Start-Sleep -Milliseconds 50
  }
}

I play piano myself, so I couldn’t help changing it a little bit so it sounds a little better.

Enjoy!

 

 

NAV Extensions – Generic Code To Migrate Data When Upgrading Extensions


This is not a post on how Extensions work – I guess there is a lot of material out there that already covers Extensions. And if you don’t find any .. just check YouTube for a few videos, like:

This small post is about migrating data from one version of your extension to another.

When you’re upgrading an extension, you’re actually doing that in a few steps:

  1. Publish a new version of your extension
  2. Uninstall the current installed version of the extension in Tenant X
  3. Install the new published version of the extension in Tenant X

Now, uninstalling an extension means it’s going to kill any change you did on the application, which also means it’s going to remove fields, and even entire tables.

What happens with the data then?

Well, to not go too deep into details: your data is moved to a new table in the SQL Server Database, like you see here:

The $AppData$… table is holding all the data with the fields and values that you would have lost when uninstalling the Extension.

So how do I get the data in?

Well, it’s all quite well explained on MSDN: Extending Microsoft Dynamics NAV Using Extension Packages. In short, you have to create a codeunit that holds at least one of these global function names:

  • OnNavAppUpgradePerDatabase()
  • OnNavAppUpgradePerCompany()

Indeed: they need to exist in your extension (as part of your deltas) as global functions. Weird approach, isn’t it? But it works – and you can even guess how it works: it’s going to call the “..PerDatabase” one only once per database (so, only once), and the other one once in each company. That’s where you will put your upgrade code.

How do I get to the data in code?

To get to your saved data, you have the statement “NAVAPP.GETARCHIVERECORDREF”. You need to provide a tableID, and a RecRef variable. The latter will contain the archived data of the table you mentioned. So this needs to be done for all the tables you have touched in your Extension .. new and modified!

Now, one point of concern: this RecRef is NOT based on the original table, which you might have expected. So you will not be able to do something like DestinationRecRef := SourceRecRef, or DestinationRecRef.GET(SourceRecRef) … or anything of that sort. There is only the long way (as far as I know) .. the only thing you can do is loop the fields, and transfer the data field by field back to where it belonged. You have to manage lost data, lost fields, conversions, … everything in code.

So, imagine, if you need to foresee upgrade code for every field .. you might want to do this as generically as possible – otherwise, at some point in the future, you WILL forget to add upgrade code for a new field you added in a new version of the Extension .. .

Let me show you one way to do this generically .. not claiming it’s the only way, or the best way .. it’s just ‘a’ way

I will share the codeunit at the end of this blogpost. But let’s first look at the main function:

And this explains it all already :-) (it’s the OnNavAppUpgradePerCompany function in the codeunit below). I have two functions and a few assumptions (I know .. assumptions are probably the mother of all f***ups.. but still ..).

When you design tables in an extension, you will either change a default table, or you will add new tables. So it’s these two situations that are handled in the two functions. On top of that, I need to make a few assumptions:

  • The new fields in the modified tables are in a specified range; in this case: 50000..99999. Other fields will be skipped. Beware! The code loops over all the fields of the default tables.
  • For the new tables, I need to loop ALL fields. In this case, I assume the new tables are in a specific number range – once again, 50000..99999.

Makes sense, but still, they are assumptions, and you should be aware of them!

But taking this into account, it’s possible to just loop the Field table / Object table, and you’re good to go. With only one remark: don’t loop the Object table (that table isn’t aware of the Extension objects), but loop the AllObj table, like you see in the code below.

That’s about it. The rest is quite straightforward. Here is the codeunit – enjoy!

OBJECT Codeunit 69611 Rental Extension Upgrade Mgt.
{
  OBJECT-PROPERTIES
  {
    Date=07/01/16;
    Time=23:31:55;
    Modified=Yes;
    Version List=CloudReadySoftware;
  }
  PROPERTIES
  {
    OnRun=BEGIN
          END;

  }
  CODE
  {

    PROCEDURE OnNavAppUpgradePerDatabase@1100084000();
    BEGIN
    END;

    PROCEDURE OnNavAppUpgradePerCompany@1100084001();
    BEGIN
      RestoreFieldsInModifiedTables(50000,99999);
      RestoreAppTables(50000,99999);
    END;

    LOCAL PROCEDURE RestoreFieldsInModifiedTables@1100084005(FromField@1100084004 : Integer;ToField@1100084005 : Integer);
    VAR
      Field@1100084001 : Record 2000000041;
      AllObj@1100084000 : Record 2000000038;
      SourceRecRef@1100084002 : RecordRef;
      DestinationRecRef@1100084003 : RecordRef;
      KeyRef@1100084006 : KeyRef;
    BEGIN
      WITH AllObj DO BEGIN
        SETRANGE("Object Type","Object Type"::Table);

        IF FINDSET THEN
          REPEAT
              Field.SETRANGE(TableNo, "Object ID");
              Field.SETRANGE("No.", FromField, ToField);
              IF NOT Field.ISEMPTY THEN BEGIN
                IF NAVAPP.GETARCHIVERECORDREF("Object ID", SourceRecRef) THEN BEGIN
                  IF SourceRecRef.FINDSET THEN
                    REPEAT
                      DestinationRecRef.OPEN("Object ID",FALSE);
                      IF GetRecRefFromRecRef(SourceRecRef,DestinationRecRef) THEN BEGIN
                        TransferCustomFieldRefs(SourceRecRef,DestinationRecRef,FromField,ToField);
                        DestinationRecRef.MODIFY;
                      END ELSE BEGIN
                        ERROR('Destination record not found.  Data would be lost.  RecordID: %1',FORMAT(SourceRecRef.RECORDID));
                      END;
                      DestinationRecRef.CLOSE;
                    UNTIL SourceRecRef.NEXT < 1;
                 END;
              END;
          UNTIL NEXT < 1;
      END;
    END;

    LOCAL PROCEDURE RestoreAppTables@1100084006(FromTableID@1100084003 : Integer;ToTableID@1100084004 : Integer);
    VAR
      SourceRecRef@1100084002 : RecordRef;
      DestinationRecRef@1100084001 : RecordRef;
      Field@1100084000 : Record 2000000041;
      AllObj@1100084005 : Record 2000000038;
    BEGIN
      WITH AllObj DO BEGIN
        SETRANGE("Object Type", "Object Type"::Table);
        SETRANGE("Object ID", FromTableID,ToTableID);

        IF FINDSET THEN
          REPEAT
            IF NAVAPP.GETARCHIVERECORDREF("Object ID", SourceRecRef) THEN BEGIN
              IF SourceRecRef.FINDSET THEN
                REPEAT
                  DestinationRecRef.OPEN("Object ID",FALSE);

                  TransferFieldRefs(SourceRecRef,DestinationRecRef);
                  DestinationRecRef.INSERT;

                  DestinationRecRef.CLOSE;
                UNTIL SourceRecRef.NEXT = 0;
            END;
          UNTIL NEXT < 1;
      END;
    END;

    LOCAL PROCEDURE TransferFieldRefs@1100084004(VAR SourceRecRef@1100084000 : RecordRef;VAR DestinationRecRef@1100084001 : RecordRef);
    VAR
      Field@1100084002 : Record 2000000041;
      SourceFieldRef@1100084003 : FieldRef;
      DestinationFieldRef@1100084004 : FieldRef;
    BEGIN
      WITH Field DO BEGIN
        SETRANGE(TableNo, DestinationRecRef.NUMBER);
        IF FINDSET THEN
          REPEAT
            IF SourceRecRef.FIELDEXIST(Field."No.") THEN BEGIN
              SourceFieldRef := SourceRecRef.FIELD(Field."No.");
              DestinationFieldRef := DestinationRecRef.FIELD(Field."No.");
              DestinationFieldRef.VALUE := SourceFieldRef.VALUE;
            END;
          UNTIL NEXT = 0;
      END;
    END;

    LOCAL PROCEDURE TransferCustomFieldRefs@1100084015(VAR SourceRecRef@1100084000 : RecordRef;VAR DestinationRecRef@1100084001 : RecordRef;FromField@1100084005 : Integer;ToField@1100084006 : Integer);
    VAR
      Field@1100084002 : Record 2000000041;
      SourceFieldRef@1100084003 : FieldRef;
      DestinationFieldRef@1100084004 : FieldRef;
    BEGIN
      WITH Field DO BEGIN
        SETRANGE(TableNo, DestinationRecRef.NUMBER);
        SETRANGE("No.",FromField,ToField);
        IF FINDSET THEN
          REPEAT
            IF SourceRecRef.FIELDEXIST(Field."No.") THEN BEGIN
              SourceFieldRef := SourceRecRef.FIELD(Field."No.");
              DestinationFieldRef := DestinationRecRef.FIELD(Field."No.");
              DestinationFieldRef.VALUE := SourceFieldRef.VALUE;
            END;
          UNTIL NEXT = 0;
      END;
    END;

    LOCAL PROCEDURE GetRecRefFromRecRef@1100084007(VAR SourceRecRef@1100084001 : RecordRef;VAR DestinationRecRef@1100084000 : RecordRef) : Boolean;
    VAR
      KeyRef@1100084002 : KeyRef;
      i@1100084003 : Integer;
      FieldRef@1100084004 : FieldRef;
      SourceFieldRef@1100084005 : FieldRef;
      DestinationFieldRef@1100084006 : FieldRef;
    BEGIN
      KeyRef := DestinationRecRef.KEYINDEX(1);
      FOR i := 1 TO KeyRef.FIELDCOUNT DO BEGIN
        FieldRef := KeyRef.FIELDINDEX(i);

        SourceFieldRef := SourceRecRef.FIELD(FieldRef.NUMBER);
        DestinationFieldRef := DestinationRecRef.FIELD(FieldRef.NUMBER);

        DestinationFieldRef.VALUE := SourceFieldRef.VALUE;
      END;

      IF NOT DestinationRecRef.FIND('=') THEN
        EXIT(FALSE)
      ELSE
        EXIT(TRUE);
    END;

    BEGIN
    END.
  }
}

 

One big disclaimer though .. I did NOT test this code thoroughly. But I did test it somewhat ;-). If there are issues, you’re always welcome to comment, and I’ll try to look at it!

Cloud SureStep for Product Development: The PowerShots


That’s right, there are new PowerShots in town!

You might have read about the previous PowerShot series on my blog, where we focused on “The Transition of your Product to the Cloud”. “We”, being the “Cloud Ready Software” dudes: Gary, Vjeko and me.

You might wonder – why would I go this year if I already went last year?

Well .. 80% of the content is brand-new, and the rest provides updated insights into procedures and technologies which have evolved with the release of Dynamics NAV 2016. That was to be expected, now that we have things like “events” and “extensions”, right?

On top of that, Cloud Ready Software has been invited by Microsoft to analyze and document the Microsoft way of doing repeatable business in all of its aspects: strategy, tooling, company setup, processes and code architecture. The 2016 PowerShots give you a head start on this Cloud SureStep for Product Development methodology:

  • Agile development and queue management will help you leverage your resources more efficiently.
  • Eventing and NAV Extensions open up a whole set of new opportunities for developing solutions.
  • Combine source code management, brand-new design patterns and PowerShell to enable continuous monthly updates.
  • Integrate all components of the Microsoft Business Cloud to build powerful and compelling solutions.
  • Learn how Dynamics NAV Managed Services take the pain out of running and distributing cloud software.

Cool, where do I register?

We are doing these PowerShots at numerous locations throughout the world. We actually already did France, and here are the dates for the other countries, with the corresponding registration links:

France – Dec. 17th / 18th – done
Switzerland – Jan. 28th / 29th – Registration
Germany – Feb. 11th / 12th – Registration
Norway – Feb. 18th / 19th – Registration
Belgium & Netherlands – Feb. 25th / 26th – Registration
Spain – Mar. 10th / 11th – Registration
Denmark – Apr. 7th / 8th – Registration
Sweden – Apr. 21st / 22nd – Registration
Italy – May, tbd – N/A
UK – May 9th / 10th – Registration
Ireland – May 12th / 13th – Registration
CEE – May 19th / 20th – Registration
North America – June: 13th/14th; 16th/17th; 20th/21st – N/A

So .. all the details you need to start registering in your country :-).

See you at the PowerShots!

NAV Extensions – Updated Generic Data Migration Options


OK, it’s official. I feel like an idiot :-).

I was so proud with my previous blogpost as I was able to solve a somewhat complicated problem generically. And now there appears to be a platform solution for what I was trying to solve (or at least for a big part of it).

I remember providing feedback to Microsoft on exactly this topic: restoring data generically with one statement. Not saying it’s because of me that it’s in the product, only saying it makes a lot of sense that there is a platform statement to just “restore what you can restore”.. .

New NAVAPP Statements

Look at this:

I’m very sure the NAVAPP.DELETEARCHIVEDATA and NAVAPP.RESTOREARCHIVEDATA (we’ll focus on this one) were added after the RTM release. In fact, I took the time to find out the following:

  • NAV2016 CU1 did not contain these statements
  • NAV2016 CU2 DID contain these statements – so from this update, you can benefit from the new statements
  • NAV2016 CU3 .. Duh ..

Though, the description of the Cumulative Update did not mention a single thing about this. This is confusing, Microsoft! :-/. There isn’t even one MSDN page that explains this, nor is the help that comes with the DVD updated. I’m guessing this is pretty new ;-).

Anyway .. as said, there are two very simple new statements:

  • NAVAPP.DELETEARCHIVEDATA(TableID);
  • NAVAPP.RESTOREARCHIVEDATA(TableID);

And there is not much explanation needed, I guess. With one statement, you can restore or delete the data that is in the archive. This is obviously only useful when you know what you’re doing, and you always want to completely restore all Extension-fields from that specific table.

Then again, it might make you wonder how it behaves in certain circumstances.

  1. The RESTOREARCHIVEDATA statement requires that all fields existing in the archive table also still exist in the new version of the extension’s destination table. If there is one field missing (because you deleted a field in the new version of your extension), it will end up in an error. In those cases, you won’t be able to use anything generic, but have to do it the hard way: write your own upgrade logic. You will get this kind of error message:
  2. Another case: what if there is a new field in the new version of the extension that doesn’t exist in the archive table? Well, there is no problem in using the RESTOREARCHIVEDATA statement, as long as the new field is fine with a default value.
  3. Last but not least: a record that was removed after uninstalling the extension means there is a record in the archive table that can’t be found when restoring the archived data. Well, in that case, the restore is just going to skip that record.

I tested these scenarios and it actually works very well. I’m sure you can come up with other scenarios .. please don’t hold back from testing them, and report back in the comment section of this blog ;-).

You still need to loop through some tables

Obviously, whether you can use a generic codeunit that will always work really depends on quite some assumptions. But let’s try to rework the codeunit from my previous blogpost with the new statements:

OBJECT Codeunit 69611 Rental Extension Upgrade Mgt.
{
  OBJECT-PROPERTIES
  {
    Date=19/01/16;
    Time=23:10:40;
    Modified=Yes;
    Version List=RA1.00;
  }
  PROPERTIES
  {
    OnRun=BEGIN
          END;

  }
  CODE
  {

    PROCEDURE OnNavAppUpgradePerDatabase@1100084000();
    BEGIN
    END;

    PROCEDURE OnNavAppUpgradePerCompany@1100084001();
    BEGIN
      RestoreFieldsInModifiedTables(50000,99999);
      RestoreAppTables(50000,99999);
    END;

    LOCAL PROCEDURE RestoreFieldsInModifiedTables@1100084005(FromField@1100084004 : Integer;ToField@1100084005 : Integer);
    VAR
      Field@1100084001 : Record 2000000041;
      AllObj@1100084000 : Record 2000000038;
      SourceRecRef@1100084002 : RecordRef;
      DestinationRecRef@1100084003 : RecordRef;
      KeyRef@1100084006 : KeyRef;
    BEGIN
      WITH AllObj DO BEGIN
        SETRANGE("Object Type","Object Type"::Table);

        IF FINDSET THEN
          REPEAT
              Field.SETRANGE(TableNo, "Object ID");
              Field.SETRANGE("No.", FromField, ToField);
              IF NOT Field.ISEMPTY THEN BEGIN
                NAVAPP.RESTOREARCHIVEDATA("Object ID");
              END;
          UNTIL NEXT < 1;
      END;
    END;

    LOCAL PROCEDURE RestoreAppTables@1100084006(FromTableID@1100084003 : Integer;ToTableID@1100084004 : Integer);
    VAR
      SourceRecRef@1100084002 : RecordRef;
      DestinationRecRef@1100084001 : RecordRef;
      Field@1100084000 : Record 2000000041;
      AllObj@1100084005 : Record 2000000038;
    BEGIN
      WITH AllObj DO BEGIN
        SETRANGE("Object Type", "Object Type"::Table);
        SETRANGE("Object ID", FromTableID,ToTableID);

        IF FINDSET THEN
          REPEAT
            NAVAPP.RESTOREARCHIVEDATA("Object ID");
          UNTIL NEXT < 1;
      END;
    END;

    BEGIN
    END.
  }
}

As you can see, it's a lot shorter – simply because a lot less code is needed:

  • No juggling around with RecRef, FieldRef, Keys, …
  • No looping of fields and transferring data one-by-one

As a matter of fact, there are only the two functions, which figure out in which default tables you have added fields .. and which new tables you have created – both in the same way: by providing an object range.

Hope you’ll enjoy!

How Do I Videos on PowerShell and Microsoft Dynamics NAV

There are a bunch of videos out there already, thanks to the "HDI" (How Do I) initiative from Microsoft. In a LOT of these videos, you see PowerShell popping up. I was interested in which ones, because it gives you an idea of the areas in which PowerShell is useful. And it seems PowerShell is useful in A LOT of areas concerning Dynamics NAV ;-).

Here is the list:

  • How Do I: Migrate from Multiple Companies to a Multi-tenant Architecture in Dynamics NAV 2013 R2 – http://youtu.be/thQFJGTYA0E
  • How Do I: Migrate a single-tenant NAV 2013 database to NAV 2013 R2 with a multi-tenancy architecture – http://youtu.be/EqWtrNoMTMM
  • How Do I: Backup and Restore in a Multitenant Environment in Microsoft Dynamics NAV 2013 R2 – http://youtu.be/v4zZdzU9fa0
  • How Do I: Set Up and Monitor Database Synchronization in Microsoft Dynamics NAV 2013 R2 – http://youtu.be/NP3ZdNYRilo
  • How Do I: Manage Tenants in Microsoft Dynamics NAV 2013 R2 – http://youtu.be/jk6nY44JrD4
  • How Do I: Run an Automated Data Upgrade in Microsoft Dynamics NAV 2013 R2 – http://youtu.be/cuYV4FO97U0
  • How Do I: Manage Users in PowerShell with Microsoft Dynamics NAV 2013 R2 – http://youtu.be/1Yb3UtS7z8A
  • How Do I: Get started with PowerShell for Microsoft Dynamics NAV 2013 R2 – http://youtu.be/MnWKPdkGFSU
  • How Do I: Create PowerShell scripts for Microsoft Dynamics NAV 2013 R2 – http://youtu.be/_pH0pzjjx5w
  • How Do I: Setup Web Client, Windows Client, NAS and Web Services on separate service tiers in Microsoft Dynamics NAV 2013 R2 – http://youtu.be/iVxYCQZa1_4
  • How Do I: Enable and Verify Single Sign-on with Office 365 in Microsoft Dynamics NAV 2013 R2 using Best Practice Analyzer and Windows PowerShell – http://youtu.be/X_GSTLG6t9E
  • How Do I: Export and Import of data in the Microsoft Dynamics NAV R2 database – http://www.youtube.com/watch?v=5a1gtFIfyQM&feature=youtu.be
  • How Do I: Use PowerShell and Windows Azure Active Directory allowing single sign on for Microsoft Dynamics NAV 2015 Windows Client – http://youtu.be/4rl1rxOnIaw
  • How Do I: Set up Office 365 Single Sign On in Microsoft Dynamics NAV 2015 using the Set-NavSingleSignOnWithOffice365 cmdlet – https://www.youtube.com/watch?v=9gOgGdYb1Ms
  • How Do I Compare Microsoft Dynamics NAV Application Objects Using Application Merge Utilities in Microsoft Dynamics NAV 2015 – https://www.youtube.com/watch?v=05b8zrGpYJ8&index=1&list=PL5B63EF419A3B59C8
  • How Do I Compile using the Development Shell in Microsoft Dynamics NAV 2015 – https://www.youtube.com/watch?v=2_nP6lroh0Q&list=PL5B63EF419A3B59C8&index=19
  • How Do I Get Started with Application Merge Utilities in Microsoft Dynamics NAV 2015 – https://www.youtube.com/watch?v=c6MLWVGXQOY&list=PL5B63EF419A3B59C8&index=16
  • How Do I Handle Captions and Translation Files Using Application Merge Utilities in Microsoft Dynamics NAV 2015 – https://www.youtube.com/watch?v=JNDcMuRHy-c&index=5&list=PL5B63EF419A3B59C8
  • How Do I Handle Documentation Trigger Conflicts Using Application Merge Utilities in Microsoft Dynamics NAV 2015 – https://www.youtube.com/watch?v=lryYApNkWC8&index=6&list=PL5B63EF419A3B59C8
  • How Do I Merge Version Lists Using the Application Merge Utilities in Microsoft Dynamics NAV 2015 – https://www.youtube.com/watch?v=xnfB_eewCKs&list=PL5B63EF419A3B59C8&index=14
  • How Do I Prepare and Run Upgrade Toolkit Step 1 in Microsoft Dynamics NAV 2015 – https://www.youtube.com/watch?v=oopKf_QI2X0&index=13&list=PL5B63EF419A3B59C8
  • How Do I Prepare and Run Upgrade Toolkit Step 2 in Microsoft Dynamics NAV 2015 – https://www.youtube.com/watch?v=okhm8bB4vps&index=9&list=PL5B63EF419A3B59C8
  • How Do I Update Microsoft Dynamics NAV Application Objects Using Application Merge Utilities in Microsoft Dynamics NAV 2015 – https://www.youtube.com/watch?v=UqLq0hByEYY&index=8&list=PL5B63EF419A3B59C8
  • How Do I Use PowerShell and Windows Azure Active Directory allowing Single Sign-on for Microsoft NAV 2015 Windows Client – https://www.youtube.com/watch?v=4rl1rxOnIaw
  • How Do I Use the Application Merge Utilities in Microsoft Dynamics NAV 2015 – https://www.youtube.com/watch?v=wCzyTtdtnkk&list=PL5B63EF419A3B59C8&index=22
  • How Do I Compare and Merge Objects Using PowerShell in Microsoft Dynamics NAV 2015 – https://www.youtube.com/watch?v=4nD7yMwO76M&list=PL5B63EF419A3B59C8&index=6
  • How Do I: Deploy Microsoft Dynamics NAV 2015 on One Microsoft Azure VM – https://www.youtube.com/watch?v=CNL-O5X2E8Q&list=PL5B63EF419A3B59C8&index=2
  • How Do I: Deploy Microsoft Dynamics NAV 2015 on Two Microsoft Azure VMs – https://www.youtube.com/watch?v=0OgyxlQ6THU&list=PL5B63EF419A3B59C8&index=3
  • How Do I: Simulate Users When Load Testing in Microsoft Dynamics NAV 2015 – https://www.youtube.com/watch?v=Ho4w8S_VDQE&list=PL5B63EF419A3B59C8&index=1
  • How Do I: Add a New Instance to Existing Microsoft Dynamics NAV on Microsoft Azure – https://www.youtube.com/watch?v=KNIZOMSXmo4&list=PL5B63EF419A3B59C8&index=2
  • How Do I: Load Test Multiple Tenants in Microsoft Dynamics NAV 2015 – https://www.youtube.com/watch?v=BNtTU4YMlQI&index=6&list=PL5B63EF419A3B59C8
  • How Do I: Install and Use the Bing Maps Demo Package on my Azure VM with Microsoft Dynamics NAV – https://www.youtube.com/watch?v=jycj_4O7di8&index=5&list=PL5B63EF419A3B59C8
  • How Do I: Install and Use the ClickOnce Demo Package on my Azure VM with Microsoft Dynamics NAV – https://www.youtube.com/watch?v=CLwfvrLK2kM&index=4&list=PL5B63EF419A3B59C8
  • How Do I: Install and Use the PowerBI Demo Package on my Azure VM with Microsoft Dynamics NAV – https://www.youtube.com/watch?v=zfVIn_FHUZ4&index=3&list=PL5B63EF419A3B59C8
  • How Do I: Install and Use the Optional Packages on My Azure VM with Microsoft Dynamics NAV – https://www.youtube.com/watch?v=a6sVE7fYACY&index=2&list=PL5B63EF419A3B59C8
  • How Do I: Install and Use the Multitenancy Demo Package on My Azure VM with Microsoft Dynamics NAV – https://www.youtube.com/watch?v=41z8lCgSphU&index=1&list=PL5B63EF419A3B59C8
  • How Do I: Deploy Microsoft Dynamics NAV 2015 License and Languages on Microsoft Azure – https://www.youtube.com/watch?v=KMrHjjpZQps&list=PL5B63EF419A3B59C8&index=42
  • How Do I: Automate Daily Test Execution in Microsoft Dynamics NAV 2016 – https://www.youtube.com/watch?v=uK49lfSkuus&list=PL5B63EF419A3B59C8&index=34
  • How Do I: Create an Extension Package in Microsoft Dynamics NAV 2016 – https://www.youtube.com/watch?v=4g0jl_UWX9E&list=PL5B63EF419A3B59C8&index=31
  • How Do I: Deploy and Manage Extensions in Microsoft Dynamics NAV 2016 – https://www.youtube.com/watch?v=tbfXxBJA3Fs&list=PL5B63EF419A3B59C8&index=27
  • How Do I: Use Microsoft Dynamics NAV Management Portal Web Services with Microsoft Windows Power Shell – https://www.youtube.com/watch?v=IYIfeuEsLqg&list=PL5B63EF419A3B59C8&index=1
  • How Do I: Deployment Microsoft Dynamics NAV 2016 Phone Client with Self-signed Certificate – https://youtu.be/i6Esylffa54
  • How Do I: Configuration of SQL Authentication in Microsoft Dynamics NAV 2016 – https://www.youtube.com/watch?v=jQNIBJnQLHQ&feature=youtu.be

Enjoy!

NAV 2016 Extension Development Shell on the Azure Demo VM

Now, that’s a long title, isn’t it? :-)

I was in a conversation with Freddy Kristiansen .. the so-called "father" of the "Azure Demo VM Image for NAV 2016" (or whatever you kids call it these days). If you haven't heard about it, I strongly advise you to check out, for example, MVP Steven Renders' blog on it: http://thinkaboutit.be/2015/10/dynamics-nav-vm-on-microsoft-azure/ .

The VM is really useful for many reasons, like:

  • To showcase NAV 2016 in full glory
    • With Add Ins
    • Integrated with CRM
    • Integrated with Office 365
  • To get familiar with
    • NAV and its functionalities
    • Extensions and how to work with them

And much more ..

But during this conversation, Freddy drew my attention to the last item in that list: getting familiar with Extensions by using the Azure Demo VM.

The Extensions Development Shell

In the NAVDEMO folder, there is the "Extensions Development Shell":

When you install it, you end up with an extra icon on your desktop:

which basically makes a PowerShell environment available for you in which it's easier to work with Extensions .. and it's definitely a very good way to start getting familiar with Extensions.

It has some new functions, based on the building blocks that are already part of the product.

Here is a video of Freddy where he explains how it’s installed and used:

Set your expectations

This shell and these scripts will work beautifully on the Azure VM, but don't expect them to work in your own environment. Copying the scripts won't cut it. They are all quite dependent on the Azure VM and all the stuff (file structure, product install files, …) that has been provided on it. It's quite likely you won't set up your own environment exactly like it. The scripts are not intended to be a full-blown development environment shell for extensions anyway – just to get you familiar with them.

But don't think they are useless for people that are already familiar with Extensions because of that .. far from it .. because this is also VERY useful for testing your extensions against multiple "languages" or "localizations", as the DVDs for all localizations are present on this VM image. As such, it's just a matter of copying your navx file to the VM, and doing something like this in the "Extensions Development Shell":

New-Devinstance DK -DevInstance TESTDK
Publish-NavApp TESTDK -Path "xxx.navx"
Install-NavApp TESTDK -Path "xxx.navx"
Start-WinClient TESTDK

This is a very useful feature that comes with it, if you ask me!

Enjoy!


NAV 2016: Hooks or Events?

You don't get the chance to blog on the 29th of February too often. In fact, you only get one chance every 1461 days .. . So let's take this opportunity to bother you with something that I have been thinking about from the moment Microsoft Dynamics NAV 2016 was released:

What do we do .. Keep using hooks? Or use events instead?

Well, one thing is for sure: Microsoft Dynamics NAV Events can't fully replace hooks, simply because there aren't enough of them .. or they are not raised in the places where we need them to be raised. To give you an easy example: on the Sales Line, in the OnValidate of the field "No.", there is a lot of checking going on (lots of TESTFIELDs) at the beginning of the trigger. Well, it also makes sense to "hook in" as an OnBeforeValidate – but AFTER these checks. Which actually means: the default trigger event (in this case) can't be used. Therefore, we're still using hooks, like you see here:

The “OnBefore” is not really OnBefore, but actually “OnJustAfterAllTheChecks…”

In my opinion, we will always have these kinds of hooks – it will be difficult for Microsoft to foresee all the possible places partners would want to hook into anyway. No code design or pattern will foresee every possible hook. Then again .. applying some patterns will surely make up for lots of them ;-).

But anyway …

So, what is it then .. Events or Hooks?

Well .. let's talk product development for a moment, in terms of "execution order of events". When you're creating a product, you usually are creating processes, workflows, .. stuff that somehow has a connection with each other .. and although you're being as generic as possible .. it still makes sense that you are in total control over the execution order of just about everything in regards to your business logic.

Well .. if you want to be in total control over your execution order .. basically .. you should make sure that you don't subscribe to the same event too often. The way we facilitate this is basically by not changing too much about our hook design pattern. It's this pattern that actually makes sure that we hook in (or let's call it "subscribe") to a certain place within Microsoft objects only once! And within these hook functions, we can still put our product business logic. Hook functions would mean: a line of code in default objects (like shown in the picture above). To avoid this, we can basically use hooks in combination with events: subscribe the hook functions if possible (if the existing events actually make sense).

So, in short – in my opinion – it's not one or the other, but a combination of both: why not use events, where possible, in combination with hooks? Like:

– we declare a Hook codeunit

– we create an On… subscriber that subscribes to the event.

– all code still goes in this hook (calling out to methods and such…), which means: total control over the execution order.

So, is subscribing multiple times to one event evil?

Most certainly not! In fact, there are design patterns that only make sense when using multiple subscribers .. or let's say that are intended for it (I will post one soon .. it's under review as we speak). But it's just something you need to be very conscious of. Every single multiple subscriber should be a very conscious decision.

Eventing Helper

Therefore, we have been implementing something we call "an Event Helper". A very simple piece of functionality that runs "OnCompanyOpen" (no, in this case we're not using an event subscriber … checking events by using events just doesn't make too much sense in my opinion – in this case, it's "just" a hook .. but anyway .. that's somewhat beside the point). Basically, what the eventing helper does is:

  • It gives us a better, more meaningful overview of all subscribers: which ones are subscribed more than once, which contain errors, .. .
  • It generates warnings or errors (depending on the type of database (LIVE, TEST, DEV, …)) when subscribers contain errors (like mismatching parameters)

As you can see: very simple .. everyone can create this based on the “Event Subscription” table. Here’s a screenshot:

(yes, you can drill down on the “No. of Subscribers” to have a look at the subscribers)

One very important “Gotcha”

Working with hooks as "subscribers to events" means that your hook will always be executed as a new instance. Your hook isn't global anymore, so it's no longer part of the running instance of (for example) codeunit 80 .. . This is something you need to manage. If your hooks have any globals or keep any state – it will be lost! Now, you shouldn't be dependent on globals in any case .. but fact is, in default NAV, the way it has been written in many cases (usually the "old" cases), you are just very dependent on the instance of the object, the current context .. and you need this context to be preserved until you reach the next part of your hook. Well, in that case, you might not want to do it with events. In that case, hooks are still the only way to go.

Conclusion

Spiderman's uncle was a wise man. It was he who said: "With great power comes great responsibility". And it makes sense for developers, doesn't it? But I must be honest .. I don't believe we need to be more responsible now that we have eventing. Before, we could break the business logic just as easily as we can today. There are just new options to break it :-).

But I love this new option. These events are powerful. And the more Microsoft restructures their code (let's assume they will), the more the default events will make sense. I even hope that at a certain point, we will have trigger events on codeunit functions. That would make a lot of sense :-).

But until then, in general .. I keep on creating hooks, subscribing them whenever I can, and keeping my execution order under full control. I don't know whether this is a new pattern or an extension of the Hook Pattern .. but does it matter ;-)?

In the meantime, all there is to say about it is: this is just my personal solution based on my personal opinion, which I shared on my personal blog – basically meaning that I don't want to force anything upon you (how is that for a disclaimer :-)).

Discovery Event Pattern

I have been working on describing this very interesting NAV Design Pattern that uses events. We have been implementing it a few times, and I just didn't want to keep it from you. You can find the pattern on "The Microsoft Dynamics NAV Design Patterns Wiki". The actual design pattern can be found here, but just for googling reasons, I'm repeating it below :-).

Enjoy!

Abstract

The "Discovery Event" pattern is a way for a piece of generic functionality to call out to other functionalities that want to make use of it, by raising an event for them to subscribe to. This is usually done so they can set themselves up within the generic app.

The problem

Let's suppose you have a generic piece of functionality that hooks into lots of places (modules) in your application. To set this up, you might have to hook into all these parts of the application. Well, this pattern turns this setup around: let all the different modules set themselves up in the generic app, which raises a "discovery event" for them to subscribe to.

Usage

The pattern is most easily described by looking at an example. This example is an actual usage of the pattern within the application, in the page Service Connections.

The goal of this functionality is to:

  • List all the different connections to external services,
  • Have a central place to navigate to the corresponding setup of the service.

The functionality (Service Connections) itself is not aware of the state, the setup, or any context of the different services in the list. All it does is:

  • It raises an event, as an opportunity for all services within the NAV application to subscribe to,
  • It has a public function, InsertServiceConnection, that the subscribers can use to register themselves with Service Connections.

The event OnRegisterServiceConnection is raised when the page (1279 – Service Connections) is opened.

One example of a subscription is the SMTP setup. In Codeunit 400 you'll find the subscriber function HandleSMTPRegisterServiceConnection, which subscribes to this discovery event and calls InsertServiceConnection to register itself.

Description

The main idea of this pattern is: “Discover the settings, the context, the records, … which I need for my functionality” or “Discover the configuration for my functionality”.  In any case, “discover” is the main idea.  It’s a pattern where using both publishers and subscribers in one application makes a lot of sense.

Let's break down the steps needed to implement the pattern.

Step 1: Publish the event

In the below example, I create a table Module Status with a published event OnDiscoverModuleStatuses.

You see that I also include the sender.  This way, I will be able to access the methods on my table (which I use as a class). Obviously, other patterns can be applied here as well, like the Argument Table pattern.

Step 2: Raise the event on the right place

When you publish an event, it should obviously be raised somewhere in the code as well. In the example below, I want to raise the event simply from a method that I call from a page. So I create a global function where I raise the event:

Step 3: Create one or more global functions, so that your subscriber can call into your functionality to configure, set up, or do whatever it needs to do to make itself discoverable

The generic functionality that I want to call should be part of the main class – in this case the Module Discovery class, or better, the table (Module Status). In this table, I create this global function, because I want to make it available to the subscribers:

The business logic doesn’t really matter for this pattern.  This is obviously dependent on the functionality where you would like to implement the pattern.

Step 4: Subscribe to this event from the places in the app, and use the global function(s)

This could be anywhere. Any module within your vertical, or within the main application, can subscribe to the event. In the example below, I create the subscriber in Codeunit 80, as I was interested in the status of the Sales module in default NAV.

The exact place of the subscriber is up to you.  The main message is that it’s part of the module that wants to subscribe, and not part of the Module Status module in the application.

Here is the subscriber (and one small helper function):

You see I can use the "sender" as a normal Record variable. I access the previously created global function to "register" this Sales module.

Microsoft Dynamics NAV Versions

This pattern only works with Microsoft Dynamics NAV 2016 and up.

Remove customizations with (Reverse) Deltas

I'm going to explain a concept to you that is so simple that you're probably going to react like: "dude .. come on .. you just totally wasted my time .. this is so obvious .. I can't believe you don't just take this for granted".

But I'm still gonna. Just because the concept is so simple that, at first, I just didn't think of using it like this.

Deltas

You know all about Deltas, right? It's the type of file you can create with PowerShell (Compare-NAVApplicationObject) that describes the changes you made to specific objects. You can apply this file to an object file, also with PowerShell (Update-NAVApplicationObject), which basically merges all the described changes into the object file (see the sketch after the list below). Good stuff .. especially in upgrade scenarios.

It's a way to transport changes and easily merge them into a database by:

  • Exporting the objects to text from that database
  • Applying the deltas with PowerShell
  • Solving conflicts, if any
  • Importing the objects
  • Compiling
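
To make this a bit more concrete, here is a minimal sketch of those two cmdlets at work (the folder names are just placeholders I made up, and I assume the objects have already been exported to text):

# Create deltas: what did MODIFIED change compared to ORIGINAL?
Compare-NAVApplicationObject `
    -OriginalPath 'C:\Temp\ORIGINAL\*.txt' `
    -ModifiedPath 'C:\Temp\MODIFIED\*.txt' `
    -DeltaPath 'C:\Temp\DELTA'

# Apply those deltas to the objects exported from the target database
Update-NAVApplicationObject `
    -TargetPath 'C:\Temp\TARGET\*.txt' `
    -DeltaPath 'C:\Temp\DELTA\*.DELTA' `
    -ResultPath 'C:\Temp\RESULT'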

What if I want to remove the changes from the database again?

Well .. Deltas describe changes. So a delta also describes the removal of any kind of control, property, object, .. . THAT is also a Delta. As such, if you apply the removal of a field to a table .. your result will obviously be a text file of the table object without that field. Furthermore, if your delta describes the removal of an object .. the result folder will not contain that object, and the "updateresult" will contain the list of objects that were removed. You can work with that!

But how do I create such a delta?

Well, this is also obvious: you create a delta by comparing two versions, right? The ORIGINAL and the MODIFIED. And you end up with a description of what you did in the modified database. Well, you can easily create a so-called "reverse delta" (there is probably a better word for it) by switching the two: as the original, you take the modified, and as the modified, you provide the original. You will end up with the exact opposite delta. And in my opinion, you should always create both deltas: the modification, and the one to reverse the modification, are equally important.
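
In PowerShell terms, that is nothing more than calling Compare-NAVApplicationObject a second time with the two paths swapped (again, the folder names are just examples):

# Normal delta: describes adding the customization
Compare-NAVApplicationObject -OriginalPath 'C:\Temp\ORIGINAL\*.txt' -ModifiedPath 'C:\Temp\MODIFIED\*.txt' -DeltaPath 'C:\Temp\DELTA'

# Reverse delta: original and modified swapped, so it describes removing the customization again
Compare-NAVApplicationObject -OriginalPath 'C:\Temp\MODIFIED\*.txt' -ModifiedPath 'C:\Temp\ORIGINAL\*.txt' -DeltaPath 'C:\Temp\DELTA_Reverse'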

I used this in the field

A while back, we were implementing an add-on, and realized that the add-on just didn't cut it for us. So basically, after quite some time, we had to remove the add-on (+300 changed objects) from our database. This turned out to be very simple: I used that "reverse delta", and removed it without any pain, including text constants, fields, local variables, undocumented code, .. and all the other pain anyone faces when doing this kind of removal job manually.

So how do you handle the deletion of objects?

As you know, when you import a text file, NAV is never going to remove objects. This needs to be handled either by you manually, or by your script. You can do it easily by looping over the UpdateResult objects you get back from the Update-NAVApplicationObject cmdlet, in combination with the cmdlet to remove objects: Delete-NAVApplicationObject (which doesn't seem to be on MSDN). I added this to my functions – which I will talk about in one of the next blogposts, where I will explain some functions I added to my repository on GitHub to work with Deltas more easily.
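
Just to give you an idea, such a loop could look something like the sketch below. Mind that this is not the exact code from my module: the database name is made up, and the property names I use on the update result objects (UpdateResult, ObjectType, Id) and the filter format are assumptions from memory, so double-check them against what Update-NAVApplicationObject actually returns in your version:

# Apply the (reverse) deltas and keep the result objects
$UpdateResult = Update-NAVApplicationObject `
    -TargetPath 'C:\Temp\TARGET\*.txt' `
    -DeltaPath 'C:\Temp\DELTA_Reverse\*.DELTA' `
    -ResultPath 'C:\Temp\RESULT'

# Objects that the delta removed completely do not end up in the result folder,
# so they still have to be deleted from the database explicitly
$UpdateResult |
    Where-Object { $_.UpdateResult -eq 'Deleted' } |
    ForEach-Object {
        Delete-NAVApplicationObject `
            -DatabaseName 'MyNavDatabase' `
            -Filter ('Type={0};ID={1}' -f $_.ObjectType, $_.Id) `
            -SynchronizeSchemaChanges Force
    }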

Export NAV Application Objects

When working with NAV, you're familiar with working with objects. And nowadays, we have quite a lot of options, like:

Depending on the situation, you might be interested in one of these kinds of exports – if not all.

Now, I have been working on some small pieces of functionality (let's call them "Extensions" ;-)) in NAV, like the Rental App that was shown at Directions, and my very own new NAV 2016 version of "WaldoNAVPad" (which is still under development, but can be obtained from GitHub here).

Now, doing these kinds of extensions, I wanted to move them to new development environments, new versions, localizations, .. . So as such, it's not really a collection of objects, but more like a collection of deltas. So I decided that the structure on my GitHub should always contain all types of object files, in all formats that might be useful to me in the future. Does it make sense? Sure, I know you can generate deltas from text files (IF you have the original) – or the other way around. But I don't really care. If I can generate it on the fly .. then I will, and just save it all. That way, I always have everything at hand if I need it. Simple as that.

Backup-NAVApplicationObjects

I have dedicated myself to making this as easy as possible for myself. All I do to save my developments to GitHub is run a very simple PowerShell script, which you see here:

$Name = 'WaldoNAVPad'
$DEVInstance = 'WaldoNAVPad_DEV'
$ORIGInstance = 'DynamicsNAV90'
$WorkingFolder = 'C:\_Workingfolder\WaldoNAVPad'
$BackupPath = 'C:\Users\Administrator\Dropbox\GitHub\Waldo.NAV\WaldoNAVPad'
$CreatedITems = Backup-NAVApplicationObjects `
                    -BackupOption OnlyModified `
                    -ServerInstance $DEVInstance `
                    -BackupPath $BackupPath `
                    -Name $Name `
                    -NavAppOriginalServerInstance $ORIGInstance `
                    -NavAppWorkingFolder $WorkingFolder 
start $BackupPath

As you can see – only one function that gets all the parameters. And the result is a fixed folder structure on my GitHub:

So, one change in one line of code is probably going to end up in 5 files.

Is this ideal? Probably not. One change in the code should be one change in your SCM. But do I care? Well .. I do, but I don't think it is that big of a deal: my script will keep everything intact: all 5 files will always be changed and up-to-date .. and the benefit of having all kinds of files available in any situation definitely makes up for it :-).

I’m interested – tell me how it works

The function is available in my PowerShell Module on GitHub, so I strongly recommend you install the module to be able to use it. But let me shortly tell you how it works. The functions are quite long, so I won't post them here – you can have a look at the up-to-date version here.

Basically, the function will:

  • Export objects in fob and txt from the database that is attached to the provided ServerInstance
  • Manage only the modified objects (this is how I do it)
  • IF you provided an original (NavAppOriginalServerInstance), it will start to create deltas, with the function "Create-NAVDelta"
  • The Create-NAVDelta function is a story on its own. It uses a working folder, because it will:
    • Export all objects from the original and split them
      • If that was already done, it's not going to do it again – it relies on the working folder
    • Export all objects from the modified database and split them
      • If that was already done, it will only export and split the modified objects again, so the modified side stays up-to-date
    • Create deltas
    • Create reverse deltas if necessary
  • The Backup-script will copy all the necessary files to the backup-folder (In my case, the GitHub).
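
If you just want the basic building blocks without my module, the standard cmdlets underneath look roughly like this (the database name and paths are made up for the example; my function adds the folder structure, the delta handling and so on around it):

# Export the modified objects, once as .fob and once as .txt (the file extension decides the format)
Export-NAVApplicationObject -DatabaseName 'WaldoNAVPad_DEV' -Path 'C:\Temp\Modified.fob' -Filter 'Modified=Yes'
Export-NAVApplicationObject -DatabaseName 'WaldoNAVPad_DEV' -Path 'C:\Temp\Modified.txt' -Filter 'Modified=Yes'

# Split the text export into one file per object, ready to compare against the original
Split-NAVApplicationObjectFile -Source 'C:\Temp\Modified.txt' -Destination 'C:\Temp\Split' -Force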

The moment I run the Backup-script above, my GitHub gets updated, and I can beautifully see what I changed in my small application. Just have a look at this example, where I fixed a simple bug.

Disclaimers

This post is not supposed to form any opinion on Source Control Management. It just describes how I "source control" my developments in my free time .. . The focus of the post is the script that simply produces all the types of exports there are.

The links to my GitHub might change, as I restructure occasionally. In that case, leave me a comment and I will update this blog, and/or just browse to my main page and start from there: https://github.com/waldo1001/. But again, I'd recommend you just simply install the module.

Apply-NAVDelta to add and remove customizations

Yep, PowerShell again. So the people that deny PowerShell can stop reading now :-).

Although I do think this is quite an interesting blogpost on how to work somewhat more easily with deltas, by using a function that is part of my PowerShell Module, called: Apply-NAVDelta.

I was actually just thinking, after my two previous posts where I talked about "reverse deltas" and "Export NAV Application Objects": why not make the "update delta" part much easier? Even more: I want to be able to import deltas into a database (of a ServerInstance) with one command. Nothing more!

That's where Apply-NAVDelta comes into play

Let me start with how an install script could look if there were only deltas in the house (this is an example for my WaldoNAVPad):

$AppName = 'WaldoNAVPad'
$WorkingFolder = "C:\_Workingfolder\$($AppName)"
$ServerInstance = 'WaldoNAVPad_DEV'

Apply-NAVDelta -DeltaPath 'C:\Users\Administrator\Dropbox\GitHub\Waldo.NAV\WaldoNAVPad\AppFiles\*.DELTA' `
    -TargetServerInstance $ServerInstance  `
    -Workingfolder $WorkingFolder `
    -OpenWorkingfolder `
    -SynchronizeSchemaChanges Force `
    -DeltaType Add `
    -VersionList $AppName

Start-NAVIdeClient -Database $ServerInstance

As you can see .. simply call the Apply-NAVDelta function, and provide:

  • The delta files that you want to apply
  • The TargetServerInstance you want to apply them to
  • The type of "apply" (are you adding it, or removing it?)
  • What the VersionList should be
  • How to sync schema changes
  • Whether to open the working folder when done

You can see the latest version of Apply-NAVDelta here.

In short, these are the steps:

  • Prepare the working folder. I'm going to export/apply deltas, which basically means: working with text files quite a lot. I need a folder to do so, so this is a very necessary step.
  • Figure out which objects I need to export by reading the deltas. I use a new function Get-NAVApplicationObjectPropertyFromDelta for this, which basically reads a delta, figures out which object we're talking about, and returns an object that can be worked with further in the script. Thanks Vjeko, for helping me with the RegEx again :-).
  • Export these objects one by one and remove the empty files (when an object doesn't exist)
  • Apply the deltas to these text files with the default "Update-NAVApplicationObject" cmdlet.
  • Update or remove the version list: obviously we don't forget to work with the VersionList, as this is never decently managed by any merge cmdlet.
  • Update the date/time to now
  • Manage the Modified flag: if we add the delta, change it to "yes"; if we remove it, get it from the modified version of the reverse delta.
  • Create reverse deltas (in case you don't have them already – so you will always be able to "unapply" the deltas)
  • Import the result file, with merged objects, updated version list, .. .
  • Delete objects – analyze the "UpdateResult" to figure out which objects need to be deleted, and delete them. This is really important: as part of a delta, an object can be deleted, but by importing a text file, you are not able to delete anything. So this needs to be managed separately.
  • Compile the uncompiled objects
  • Open the working folder if requested

So .. that's quite a sandwich, isn't it? Applying deltas to a database is not just the Update-NAVApplicationObject cmdlet. Far from it. You need to do much more. But as you can see – you can easily script your way through it, so you can have one cmdlet that does it all. And thanks to this script, I have been using DELTA files much more than any text or fob file. Especially in situations where I need to be very flexible in regards to implementing "apps" in multiple versions/localizations of Microsoft Dynamics NAV.
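
Just to show the standard cmdlets underneath – this is NOT my actual Apply-NAVDelta function, just a stripped-down sketch of the main steps with made-up names and paths, and without the version list, Modified flag and reverse-delta handling:

# 1. Export the target object that a delta applies to (one example filter)
Export-NAVApplicationObject -DatabaseName 'WaldoNAVPad_DEV' -Path 'C:\Temp\Target\TAB36.txt' -Filter 'Type=Table;ID=36'

# 2. Apply the deltas to those text files
Update-NAVApplicationObject `
    -TargetPath 'C:\Temp\Target\*.txt' `
    -DeltaPath 'C:\Users\Administrator\Dropbox\GitHub\Waldo.NAV\WaldoNAVPad\AppFiles\*.DELTA' `
    -ResultPath 'C:\Temp\Result'

# 3. Join the merged result into one file and import it
Join-NAVApplicationObjectFile -Source 'C:\Temp\Result\*.txt' -Destination 'C:\Temp\Result.txt'
Import-NAVApplicationObject -DatabaseName 'WaldoNAVPad_DEV' -Path 'C:\Temp\Result.txt' -ImportAction Overwrite -SynchronizeSchemaChanges Force

# 4. Compile everything that is not compiled yet
Compile-NAVApplicationObject -DatabaseName 'WaldoNAVPad_DEV' -Filter 'Compiled=0' -SynchronizeSchemaChanges Force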

It’s all available for you .. hope you use it! :)

Can I UnApply as well?

Yes, you can remove your customizations just as easily as you applied them – as long as you created the "reverse delta" .. because removing your customization is applying the reverse delta, and telling the script you're removing by using the DeltaType "Remove". As an example, this is the uninstall script for WaldoNAVPad:

$AppName = 'WaldoNAVPad'
$WorkingFolder = "C:\_Workingfolder\$($AppName)"
$ServerInstance = 'WaldoNAVPad_DEV'

Apply-NAVDelta -DeltaPath 'C:\_Workingfolder\WaldoNAVPad\CreateDeltas\AppFiles_Reverse\*.DELTA' `
    -TargetServerInstance $ServerInstance  `
    -Workingfolder $WorkingFolder `
    -OpenWorkingfolder `
    -SynchronizeSchemaChanges Force `
    -DeltaType Remove `
    -VersionList $AppName

Start-NAVIdeClient -Database $ServerInstance

Is this a replacement for Extensions?

Most definitely not. Extensions bring us much more, like transportability (with the navx file), upgradability (taking care of the data), per-tenant customization, the promise that they won't cause conflicts, prerequisites, dependencies, … . The above is merely a way to add customizations to a database by importing deltas instead of text files – to avoid the risk of overwriting stuff you don't want to overwrite. And a way to remove these customizations, without the risk of removing too much ;-). Call it "Extensions on valium" ;-). Extremely useful when you're developing small generic functionalities that have to be imported into different customer databases, database versions, localizations, whatever… .
