Channel: Dynamics 365 Blog

Useful Dialog Windows with .NET Interop and NAV 2009 R2


We have been asked recently how to display a Dialog window for the RoleTailored client that would collect a Directory path.

The old (good) Codeunit 412 "Common Dialog Management" was not suitable for that purpose (and, honestly, I would prefer something more RTC-oriented).

It occurred to me, then, that there are many useful dialog windows available in the System.Windows.Forms namespace, for example:

  1. How to select a DIRECTORY
  2. How to select a FILE
  3. How to select a COLOR
  4. How to select a PRINTER

Attached you will find one unbound page object in TXT format.

    The code is fairly simple.
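To illustrate the first case, selecting a directory with System.Windows.Forms could be sketched roughly as follows in C/AL (a sketch only; variable names are mine, and the attached page object remains the authoritative version):

```
// FolderBrowserDialog and DialogResult are DotNet variables mapped to the
// System.Windows.Forms.FolderBrowserDialog and System.Windows.Forms.DialogResult types.
FolderBrowserDialog := FolderBrowserDialog.FolderBrowserDialog;
FolderBrowserDialog.Description := 'Select a directory';
FolderBrowserDialog.ShowNewFolderButton := TRUE;
DialogResult := FolderBrowserDialog.ShowDialog;
// DotNet enum values can be compared via FORMAT in C/AL
IF FORMAT(DialogResult) = 'OK' THEN
  MESSAGE('You selected: %1',FolderBrowserDialog.SelectedPath);
```

The other dialogs (OpenFileDialog, ColorDialog, PrintDialog) follow the same pattern.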

     

    These postings are provided "AS IS" with no warranties and confer no rights. You assume all risk for your use.

    Duilio Tacconi (dtacconi)         

    Microsoft Dynamics Italy                     

    Microsoft Customer Service and Support (CSS) EMEA

    - Thanks to Carsten Scholling from Microsoft Dynamics CSS Germany -


    Manage Max No. of XML Records to Send Option from RoleTailored Client (with .NET Interop)


    With NAV 2009 R2 it is now possible to set how many XML records to send to RoleTailored Client (RTC), bypassing the previous hardcoded limitation of 5000 records.

This can be done by installing platform hotfix KB 2492490, build no. 32146 or higher.

    https://mbs2.microsoft.com/Knowledgebase/KBDisplay.aspx?scid=kb$EN-US$2492490&wa=wsignin1.0

     

In short, I would just like to explain how this hotfix works.

To bypass the previous hardcoded limit of 5000 XML records, after installing the aforementioned platform hotfix (or a later build), you have to add a key like the following to the ClientUserSettings.config file on every RTC client machine:

<add key="MaxNoOfXMLRecordsToSend" value="integerValue" />

where integerValue is the maximum number of XML records to send when using the Export to Microsoft Office feature.

Since the Classic client handles this value directly from the IDE, by changing the "Max. no. of XML records to send" property under Tools > Options, I have been asked whether there is a more flexible way to manage this key in the ClientUserSettings.config file.

In this blog I have therefore tried to mimic what the Classic client does, developing a simple page and codeunit that let the user change the value for this key dynamically and, if the key is not present, add it directly to the ClientUserSettings.config file with the desired value (this would also make the hotfix deployment faster).

    The code in the attached .txt file is quite simple and is based on .NET interoperability using the following:

    System.Environment

    http://msdn.microsoft.com/en-us/library/system.environment(v=VS.90).aspx

    System.IO

    http://msdn.microsoft.com/en-us/library/system.io(v=VS.90).aspx

    System.XML

    http://msdn.microsoft.com/en-us/library/y3y47afh(v=VS.90).aspx
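To give an idea of how such a codeunit could create or update the key, here is a sketch under assumptions: XmlDoc, XmlNode, XmlElement, AppSettingsNode, and AttributeNode are DotNet variables for the corresponding System.Xml types, ConfigFileName already holds the full path to the client's configuration file, and the exact XPath depends on the file's actual structure. The attached .txt object is the authoritative version.

```
// Load the configuration file and look for the key.
XmlDoc := XmlDoc.XmlDocument;
XmlDoc.Load(ConfigFileName);
XmlNode := XmlDoc.SelectSingleNode('//add[@key="MaxNoOfXMLRecordsToSend"]');
IF ISNULL(XmlNode) THEN BEGIN
  // Key not present: add it under <appSettings> with the desired value.
  XmlElement := XmlDoc.CreateElement('add');
  XmlElement.SetAttribute('key','MaxNoOfXMLRecordsToSend');
  XmlElement.SetAttribute('value',FORMAT(NewValue));
  AppSettingsNode := XmlDoc.SelectSingleNode('//appSettings');
  AppSettingsNode.AppendChild(XmlElement);
END ELSE BEGIN
  // Key present: just update its value attribute.
  AttributeNode := XmlNode.Attributes.GetNamedItem('value');
  AttributeNode.Value := FORMAT(NewValue);
END;
XmlDoc.Save(ConfigFileName);
```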

     

    These postings are provided "AS IS" with no warranties and confer no rights. You assume all risk for your use.

    Duilio Tacconi (dtacconi)

    Microsoft Dynamics Italy

    Microsoft Customer Service and Support (CSS) EMEA

    A special thanks to Jorge Alberto Torres - DK-MBS NAV Development

    How to Add Shortcuts Menu in the RoleTailored Client


The RoleTailored client environment in Microsoft Dynamics NAV 2009 R2 does not include the Shortcuts menu feature that was present in the Classic client in earlier versions of Dynamics NAV.

In the Classic client it is possible to create and open shortcuts from this menu.

In this blog I have therefore tried to mimic what the Classic client does, developing a couple of pages and one table that let users achieve similar functionality in the RoleTailored client. The text object in this blog contains:

    1. Table 50300 “Shortcut”
    2. Page 50300 “Shortcut List”
    3. Page 50301 “Create Shortcut Worksheet”

The first task is to add Page 50300 "Shortcut List" to a Department MenuSuite, as in the example below:

Second, you need to customize the navigation pane in the RoleTailored client in order to add a new menu called, for example, "Shortcuts", with Page 50300 "Shortcut List" added to it.

This can be done manually

    http://blogs.msdn.com/b/nav/archive/2011/03/30/nav-2009-tips-and-tricks-personalize-the-departments-menu.aspx

or by configuring it for a profile

    http://msdn.microsoft.com/en-us/library/dd301231.aspx

Once you have completed those two steps, your users should be good to go, able to create and open shortcuts in much the same way as in the Classic client.
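To give a feel for the Open part (a sketch; the field name and the link values below are assumptions, not necessarily what the attached objects use), the RoleTailored client can open both regular URLs and dynamicsnav:// links through HYPERLINK:

```
// "Link" is an assumed Text field on Table 50300 "Shortcut" holding the target.
// An RTC link can be built from placeholder values like these (a real
// implementation would URL-encode the company name):
Shortcut.Link :=
  STRSUBSTNO('dynamicsnav://%1:%2/%3/%4/runpage?page=%5',
    'myserver','7046','DynamicsNAV','CRONUS International Ltd.','21');
// Opening the shortcut is then a one-liner:
HYPERLINK(Shortcut.Link);
```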

    These postings are provided "AS IS" with no warranties and confer no rights. You assume all risk for your use.

    Duilio Tacconi (dtacconi)

Microsoft Dynamics Italy

    Microsoft Customer Service and Support (CSS) EMEA

    QR Codes for Microsoft Dynamics NAV


QR codes (abbreviated from Quick Response code) are appearing in many different places today, and they have proven quick and efficient when it comes to working with mobile phones and other devices that can read them. A QR code is a multipurpose instrument: it can hold all sorts of valuable information, such as your company's or your salesperson's contact details, sales invoice information, promotional codes, location information, checksums, amounts, web links, and so on, which you can read with a QR code reader to automate some routine manual processes, like typing things in by hand.

    The QR code is a two-dimensional data-matrix which can be decoded very fast. The format is clearly defined and published as an ISO standard.

    QR code

As the Windows Phone 7.5 update, code-named "Mango", rolls out to customers, it becomes even more relevant to use QR codes, as you can now use your Windows Phone camera to scan QR codes by bringing them into the camera view. Bing will recognize QR codes and help you save and use the information encoded in them.

    In some countries, popularity of QR codes has grown so much, that their usage is now considered a national standard. Our team has recently released an update for the Mexican market, where we added QR codes to several major Microsoft Dynamics NAV documents. And we thought – why don’t we let everyone else enjoy this new cool feature?

    The update is available for NAV 5.0 SP1, NAV 2009 SP1 and NAV 2009 R2 versions of the product.

URL for Microsoft Dynamics NAV 2009 SP1 and R2:

    https://mbs.microsoft.com/customersource/downloads/taxupdates/MSDNAV2009SP1ElectronicInvoice_Mexico   

URL for Microsoft Dynamics NAV 5.0 SP1:

    https://mbs.microsoft.com/customersource/downloads/taxupdates/MSDNAV5SP1ElectronicInvoice_Mexico

    However, the only part you need from it is the MXElectronicInvoice.msi file included in the package. Note that the .msi file is exactly the same for both versions of NAV.

    Here is what you have to do to get your data encoded into a QR code:

1. Run the installer to deploy the dll we shipped for this update. Among other things, the dll includes the QRCodeProvider and IBarCodeProvider classes, which we can use.

2. Add a BLOB field that will store the QR code image, for example into the Sales Invoice Header table:

    3. Remember to set the SubType property to Bitmap if you would like to use the QR code on pages:

4. You can now use the following code to generate a QR code image, which for demo purposes will be saved into the first posted sales invoice found (needless to say, you should be doing this on a test database ;) ). In this example we will encode a contact card with some predefined details.

    OBJECT Codeunit 50001 QR Code Mgt.
    {
      OBJECT-PROPERTIES
      {
        Date=;
        Time=;
        Modified=Yes;
        Version List=QR Code;
      }
      PROPERTIES
      {
        OnRun=VAR
                CompanyInfo@1170000004 : Record 2000000006;
                SalesInvoiceHeader@1170000003 : Record 112;
                TempBlob@1170000002 : Record 99008535;
                QRCodeInput@1170000000 : Text[1024];
                QRCodeFileName@1170000001 : Text[1024];
              BEGIN
                // Save a QR code image into a file in a temporary folder
                QRCodeInput := CreateQRCodeInput('John,Doe','+555 1231231','john@doe.zzz','www.johndoe.zzz');
                QRCodeFileName := GetQRCode(QRCodeInput);
                QRCodeFileName := MoveToMagicPath(QRCodeFileName); // To avoid confirmation dialogue on RTC

               // Load the image from file into the BLOB field
                CLEAR(TempBlob);
                ThreeTierMgt.BLOBImport(TempBlob,QRCodeFileName,FALSE);
                IF SalesInvoiceHeader.FINDFIRST THEN BEGIN
                  SalesInvoiceHeader."QR Code" := TempBlob.Blob;
                  SalesInvoiceHeader.MODIFY;
                END;

               // Erase the temporary file
                IF NOT ISSERVICETIER THEN
                  IF EXISTS(QRCodeFileName) THEN
                    ERASE(QRCodeFileName);

                MESSAGE('Done!');
              END;
      }
      CODE
      {
        VAR
          ThreeTierMgt@1170000001 : Codeunit 419;

        LOCAL PROCEDURE CreateQRCodeInput@1020046(Name@1020000 : Text[80];PhoneNo@1020002 : Text[80];EMail@1020003 : Text[80];URL@1170000000 : Text[80]) QRCodeInput : Text[1024];
        BEGIN
          QRCodeInput :=
            'MECARD:' +
            'N:' + Name + ';' +
            'TEL:' + PhoneNo + ';' +
            'EMAIL:' + EMail + ';' +
            'URL:' + URL + ';';
        END;

        LOCAL PROCEDURE GetQRCode@1020038(QRCodeInput@1020001 : Text[1024]) QRCodeFileName : Text[1024];
        VAR
          IBarCodeProvider@1020000 : Automation "{89F54BC4-E6C9-44BA-8574-86568625BFF8} 1.0:{9FE38730-1A3C-4B84-A8C2-AFAC6A90E641}:'Microsoft Dynamics Nav MX Services'.IBarCodeProvider";
        BEGIN
          GetBarCodeProvider(IBarCodeProvider);
          QRCodeFileName := IBarCodeProvider.GetBarCode(QRCodeInput);
        END;

        PROCEDURE GetBarCodeProvider@1020001(VAR IBarCodeProvider@1020000 : Automation "{89F54BC4-E6C9-44BA-8574-86568625BFF8} 1.0:{9FE38730-1A3C-4B84-A8C2-AFAC6A90E641}:'Microsoft Dynamics Nav MX Services'.IBarCodeProvider");
        VAR
          QRCodeProvider@1020002 : Automation "{89F54BC4-E6C9-44BA-8574-86568625BFF8} 1.0:{69FEA5E6-0A76-4555-B74B-F170956B0098}:'Microsoft Dynamics Nav MX Services'.QRCodeProvider";
        BEGIN
          IF ISCLEAR(QRCodeProvider) THEN
            CREATE(QRCodeProvider,TRUE,TRUE);
          IBarCodeProvider := QRCodeProvider;
        END;

        PROCEDURE MoveToMagicPath@1170000000(SourceFileName@1170000000 : Text[1024]) DestinationFileName : Text[1024];
        VAR
          FileSystemObject@1170000001 : Automation "{F935DC20-1CF0-11D0-ADB9-00C04FD58A0B} 1.0:{0D43FE01-F093-11CF-8940-00A0C9054228}:'Windows Script Host Object Model'.FileSystemObject";
        BEGIN
          DestinationFileName := ThreeTierMgt.ClientTempFileName('','');
          IF ISCLEAR(FileSystemObject) THEN
            CREATE(FileSystemObject,TRUE,TRUE);
          FileSystemObject.MoveFile(SourceFileName,DestinationFileName);
        END;

        BEGIN
        END.
      }
    }

5. With the image saved in the BLOB field, it is now "business as usual" to add it to a report. You can see, for example, how the company logo is added to the standard NAV document reports. NB: Don't forget to run CALCFIELDS on the "QR Code" field before you display its content. :-)
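For instance, a minimal sketch of that CALCFIELDS call in the report's Sales Invoice Header - OnAfterGetRecord trigger (assuming the picture box on the report section uses "QR Code" as its SourceExpr):

```
// BLOB fields are not loaded automatically with the record;
// fetch the image explicitly before the section is rendered.
CALCFIELDS("QR Code");
```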

6. And finally, run the report to see the QR code, which you or your customers can scan, for example, with your favorite Windows Phone 7.5 device:

     

    These postings are provided "AS IS" with no warranties and confer no rights. You assume all risk for your use.

     

    Best regards,

    Microsoft Dynamics NAV ERM Team

    Dmitry Chadayev, Program Manager

    Filtering on Dimension Values


    Microsoft Dynamics NAV supports unlimited dimensions and unlimited dimension values. You can create as many as you want, and you can use those all across the application. You can give two of these dimensions special treatment by setting them up as global dimensions. What is special about the global dimensions is that their values are stored directly on the records they belong to. All other dimension values are stored in a separate table. This means that you can filter on these two dimensions. In many places in the standard application, we have placed Global Dimension Filter fields that can be used to filter FlowFields. However, to get any data on any of the other dimensions, you would have to rely on Analysis Views to retrieve the information.

    In Microsoft Dynamics NAV 2013, the dimensions functionality has been heavily redesigned. Instead of storing all individual dimension values for each record in separate tables, each unique combination of dimensions and values gets an ID, and this dimension set ID is stored directly on the record that those values belong to. With this change, we have taken an important step: to store all information about dimensions and their values directly on the record.

Since all the required information is stored on the record, though somewhat indirectly, the obvious question is whether it is now possible to filter on any dimension and any dimension value. As it turns out, it is, and it's not that hard to do. This blog entry describes some suggested patterns for using filters on dimension set IDs.

    As mentioned, the records contain dimension set IDs, which are integers that represent the combination of dimension values for a specific record. The biggest problem is to convert a typical filter on a dimension into a filter of dimension set IDs. Fortunately, we already have a few functions in Microsoft Dynamics NAV that can provide that information. With these functions in mind, we can build a page where you can input any combination of dimensions and dimension values in the form of filters, and you can then calculate the corresponding set of dimension set IDs. With all of these IDs, we can build one long filter string and use it to filter on the dimension set ID field. This enables, with relative ease, direct filtering on Dimension Values.

    So here’s what that page could look like (as text representation):

    OBJECT Page 50000 Dimension Set ID Filter
    {
      OBJECT-PROPERTIES
      {
        Date=;
        Time=;
        Version List=;
      }
      PROPERTIES
      {
        SourceTable=Table348;
        PageType=List;
        SourceTableTemporary=Yes;
        OnAfterGetRecord=BEGIN
                           SetDimensionValueFilter
                         END;

        OnNewRecord=BEGIN
                      DimensionValueFilter := ''
                    END;

        OnDeleteRecord=BEGIN
                         TempDimensionValue.SETRANGE("Dimension Code",Code);
                         TempDimensionValue.DELETEALL;
                         DELETE;
                         EXIT(FALSE)
                       END;

      }
      CONTROLS
      {
        { 1   ;    ;Container ;
                    ContainerType=ContentArea }

        { 4   ;1   ;Group     ;
                    GroupType=Repeater }

        { 2   ;2   ;Field     ;
                    SourceExpr=Code;
                    TableRelation=Dimension.Code }

        { 3   ;2   ;Field     ;
                    CaptionML=ENU=Dimension Value Filter;
                    SourceExpr=DimensionValueFilter;
                    OnValidate=BEGIN
                                 InsertDimensionValues(DimensionValueFilter)
                               END;

                    OnLookup=VAR
                               DimensionValue@1000 : Record 349;
                             BEGIN
                               DimensionValue.LookUpDimFilter(Code,Text);
                               EXIT(TRUE)
                             END;
                              }

      }
      CODE
      {
        VAR
          TempDimensionValue@1001 : TEMPORARY Record 349;
          DimensionValueFilter@1000 : Text;

        LOCAL PROCEDURE GetFilterString@28() Filter : Text;
        VAR
          DimensionMgt@1000 : Codeunit 408;
          SelectionFilterManagement@1001 : Codeunit 46;
          NextFilterChunk@1002 : Text;
        BEGIN
          IF FINDSET THEN
            REPEAT
              TempDimensionValue.SETRANGE("Dimension Code",Code);
              DimensionMgt.GetDimSetIDsForFilter(Code,
                SelectionFilterManagement.GetSelectionFilterForDimensionValue(TempDimensionValue))
            UNTIL NEXT = 0;
          NextFilterChunk := DimensionMgt.GetNextDimSetFilterChunk(1024);
          WHILE NextFilterChunk <> '' DO BEGIN
            Filter += NextFilterChunk;
            NextFilterChunk := DimensionMgt.GetNextDimSetFilterChunk(1024)
          END
        END;

        LOCAL PROCEDURE InsertDimensionValues@1(NewFilter@1000 : Text);
        VAR
          DimensionValue@1001 : Record 349;
        BEGIN
          TempDimensionValue.SETRANGE("Dimension Code",Code);
          TempDimensionValue.DELETEALL;
          DimensionValue.SETRANGE("Dimension Code",Code);
          DimensionValue.SETFILTER(Code,NewFilter);
          IF DimensionValue.FINDSET THEN BEGIN
            TempDimensionValue."Dimension Code" := DimensionValue."Dimension Code";
            REPEAT
              TempDimensionValue.Code := DimensionValue.Code;
              TempDimensionValue.INSERT
            UNTIL DimensionValue.NEXT = 0
          END
        END;

        LOCAL PROCEDURE SetDimensionValueFilter@2();
        VAR
          SelectionFilterManagement@1000 : Codeunit 46;
        BEGIN
          TempDimensionValue.SETRANGE("Dimension Code",Code);
          DimensionValueFilter :=
            SelectionFilterManagement.GetSelectionFilterForDimensionValue(TempDimensionValue);
          TempDimensionValue.SETRANGE("Dimension Code")
        END;

        PROCEDURE LookupFilter@6() : Text;
        VAR
          DimSetIDFilterPage@1001 : Page 50000;
        BEGIN
          DimSetIDFilterPage.SetTempDimTables(Rec,TempDimensionValue);
          DimSetIDFilterPage.EDITABLE(TRUE);
          DimSetIDFilterPage.RUNMODAL;
          DimSetIDFilterPage.GetTempDimTables(Rec,TempDimensionValue);
          EXIT(GetFilterString)
        END;

        PROCEDURE GetTempDimTables@8(VAR NewDimension@1000 : Record 348;VAR NewDimensionValue@1001 : Record 349);
        BEGIN
          NewDimension.COPY(Rec,TRUE);
          NewDimensionValue.COPY(TempDimensionValue,TRUE)
        END;

        PROCEDURE SetTempDimTables@3(VAR NewDimension@1000 : Record 348;VAR NewDimensionValue@1001 : Record 349);
        BEGIN
          COPY(NewDimension,TRUE);
          TempDimensionValue.COPY(NewDimensionValue,TRUE)
        END;

        BEGIN
        END.
      }
    }

    Let’s look at some of the elements on this page in more detail:

• You’ll notice that this page uses SourceTableTemporary=Yes, so the dimensions and dimension values entered here never touch the real tables.
• The OnDeleteRecord trigger has some code to handle the deletion of records on this page. As noted above, we are dealing with temporary records, and in this case we do not want to run the OnDelete trigger from the Dimension table, so we need to handle this manually.
    • The page has 3 public functions. The only one we’ll be using to call this page is the function LookupFilter, which will return a string representing the Dimension Set ID filter. I have used a little trick to make it easy to implement the calling of this page in a one-liner by having the RUNMODAL in the LookupFilter function. To allow the values entered on the page to be saved across runs, I added the functions GetTempDimTables and SetTempDimTables.
    • Instead of storing the filter string of values for each dimension, I store the dimension values in a temporary table. This makes it easier to call the functions in Codeunit 408 DimensionManagement that we need to get the dimension set IDs. The local function InsertDimensionValues converts the dimension values filter into Dimension Value records in the temporary table.

The next step is to implement an action on a page such as General Ledger Entries. All we have to do is add a single action and a variable. Make the variable global if you’d like to save the values when the action is re-run, or make it local to make the page ‘forget’ what was entered before.

    Here’s the Global Variable and the action that I added to the General Ledger Entries page (just below Action 50 GLDimensionOverview):

          DimSetIDFilterPage@1001 : Page 50000;

          { 3       ;2   ;Action    ;
                          Ellipsis=Yes;
                          CaptionML=ENU=Set Dimension Filter;
                          Image=Filter;
                          OnAction=BEGIN
                                     SETFILTER("Dimension Set ID",DimSetIDFilterPage.LookupFilter)
                                   END;
          }

    Now that we have the page and the action, we can run it and see what it looks like and how the filtering will work:
    In page 20 General Ledger Entries, in the Entry group, choose Set Dimension Filter.

    The new page that I added opens, and you can use the lookup on the Code column and the Dimension Value Filter field to select the values you want filtered as shown in the following screenshot:

    When you choose the OK button, the General Ledger Entries page will be filtered by the corresponding dimension set IDs that will be shown on the page as illustrated by the following screenshot:

    This was a fairly simple example to show how you can use dimension set IDs. But of course we can make the filter more complex. For example, if you want to know which records have the combination of AREA=30, BUSINESSGROUP=OFFICE and SALESPERSON=JR, you could set up a filter as shown in the following screenshot:

    We can also find out which records do not have a value for one or more dimensions.
    For example, the following screenshot illustrates a filter to show records with AREA 30 or 40 that do not have a value for PROJECT and SALESPERSON:

    You can enter any filter using all of the operators you already know, such as .., <>, & and |.
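For instance, the last scenario could be entered on the filter page roughly like this (a sketch based on the demo-data dimension codes; the screenshots above remain the authoritative reference). An empty-string filter ('') selects entries with no value for that dimension:

```
Code          Dimension Value Filter
AREA          30|40
PROJECT       ''
SALESPERSON   ''
```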

    This seems nice so far, but I would also want to be able to apply these filters to a page like Chart of Accounts and have the amounts reflect the applied filters. This change requires a new field and a small change to the FlowFields on table 15 G/L Account.

    The new field will be a FlowFilter field:

        { 50000;   ;Dimension Set ID Filter;Integer    ;FieldClass=FlowFilter }

    Additionally, change the CalcFormula for each FlowField that you want filtered such as the following example for field 32 Net Change:

    CalcFormula=Sum("G/L Entry".Amount WHERE (G/L Account No.=FIELD(No.),
                                              G/L Account No.=FIELD(FILTER(Totaling)),
                                              Business Unit Code=FIELD(Business Unit Filter),
                                              Global Dimension 1 Code=FIELD(Global Dimension 1 Filter),
                                              Global Dimension 2 Code=FIELD(Global Dimension 2 Filter),
                                              Posting Date=FIELD(Date Filter),
                                              Dimension Set ID=FIELD(Dimension Set ID Filter)));

    The action on page 16 Chart of Accounts is very similar to the one on page 20 General Ledger Entries that we created earlier. We just need to change the field we apply the filter to, so change the code to the following:

          DimSetIDFilterPage@1003 : Page 50000;

          { 5       ;3   ;Action    ;
                          Ellipsis=Yes;
                          CaptionML=ENU=Set Dimension Filter;
                          Image=Filter;
                          OnAction=BEGIN
                                     SETFILTER("Dimension Set ID Filter",DimSetIDFilterPage.LookupFilter)
                                   END;
          }

    That was it! Now you can filter directly on any dimension and any combination of dimensions on the chart of accounts and have the amounts be filtered. When you drill down on an amount, the filter will be carried over so you can see exactly which records make up the sum.

There are many more pages in the standard application for Microsoft Dynamics NAV where you can add actions such as the two described above. So please add these filters wherever you find them useful.

    Feel free to share your thoughts and comments on the feature and the code!

    -Gert Robyns

    Crash Dump Creation


Actually, NAV should never crash, but sometimes circumstances cause one of the NAV processes or services to crash. For the user and the administrator, it is usually not clear why a component crashed. Most of the time the application event log does not show enough information to determine the root cause of the crash.

A "crash dump" provides very good information about the module and code that caused the crash. Crash dumps can be analyzed by Microsoft Dynamics NAV support and help a lot in finding the problem.

With the "Debug Diagnostic Tool v1.2" it is very easy to create crash dump files in case of a crash.

    The tool can be downloaded under the following link: http://www.microsoft.com/en-us/download/details.aspx?id=26798

Note: If the operating system is not configured with "en-us" region and language settings, then the following steps are necessary to install the Debug Diagnostic Tool v1.2:

1. Create a local group named "Users" (via Control Panel -> Administrative Tools -> Computer Management -> Local Users and Groups).
2. Add yourself to this local group.
3. Grant this group full access to the DebugDiag installation folder.
4. Install DebugDiag.
5. You will get an error when the installer tries to remove the backup files.
6. While this error is shown and the installation is still running, access the installation directory (as administrator).
7. Copy the files to another directory (e.g. C:\Temp\DebugDiagSave).
8. Let the registration complete (it fails, and the files are deleted).
9. Create the DebugDiag folder again under C:\Program Files (or under the x86 program folder on 32-bit systems).
10. Copy the files back to the installation directory.
11. Run the installation once again.
12. Run Register.bat from the installation folder to register all COM objects.

    After you install and run the application, you are welcomed by the "Rule Wizard."

For crash dump creation, choose Crash and then choose the Next button. Next comes the selection of what you would like to monitor. For NAV, it is necessary to select "A specific process":

    The next selection window shows all processes. Here, you can select, for example, "Microsoft.Dynamics.NAV.Client.exe".

With "Advanced Configuration" it is now possible to set the granularity of process monitoring.

    The next screen shows the path where the crash dump is saved. You must select a hard disk with enough capacity. Crash dumps can be very large in some circumstances (e.g. if you selected "Full Userdump" in the "Advanced Configuration" screen).

In the last step, the rule must be activated:

Once the rule is activated, "Debug Diag" monitors the configured process.

In case of a crash, "Debug Diag" then creates the necessary .dmp files in the specified folder and additional log files in the "Debug Diag" installation folder (e.g. "C:\Program Files\DebugDiag\Logs").

I hope the described steps help you capture this important crash dump information.

    Frank Wurzel

    Microsoft Dynamics Germany

Example of How to Use the SQL Tracing Feature to Profile AL Code


    Enabling Tracing in Microsoft Dynamics NAV 2013

Microsoft Dynamics NAV 2013 has a feature that allows you to see the AL call stack for SQL commands. Here I am going to describe how it can be used to profile your application code.

    There are multiple steps required to start tracing.

    First, you need to start the Session List page. This is the same page that you need to open to start the debugger. So you need to start the development environment, then go to Tools, Debugger, Debug Session. You will get the Session List page.

This window contains Start Full SQL Tracing and Stop Full SQL Tracing buttons, as well as an editable SQL Tracing check box on each line.

Start Full SQL Tracing/Stop Full SQL Tracing enables or disables tracing for all new and existing sessions, while the SQL Tracing check box enables or disables tracing for a given session.

    Let’s assume we want to profile one of the sessions. Then we need to enable tracing for it, for example by clicking the check box.

    Configuring SQL Profiler

The important part here is to select the appropriate events. In this case we are interested in seeing the SQL statements' text. To achieve that, we need to enable the SP:StmtCompleted and SQL:BatchCompleted events. The setup should look like the following picture.

This allows you to see the SQL queries for all statements issued from AL.

    After this if you do some operations in the client, for example open the Sales Orders page, you will see comments in SQL Server Profiler.

    This is an example of what you can get.

All SQL statements in between consecutive comments correspond to the AL statement from the first comment. For example, in the previous screenshot, CALCFIELDS issues six SQL queries.

The SQL Server Profiler trace will also contain Get connection from the pool and Return connection to the pool comments.

These comments correspond to the events when a connection is retrieved from and returned to the Microsoft Dynamics NAV Server connection pool, respectively. They are needed to separate SQL queries issued from different clients on the same SQL connection. The SQL statements that correspond to these comments are issued by Microsoft Dynamics NAV Server but do not originate from AL.

Comments that contain only a user name also correspond to SQL statements issued by Microsoft Dynamics NAV Server but not originating from AL. For example, Microsoft Dynamics NAV Server executes queries to calculate the calculated fields shown on FactBoxes. We need this type of comment because Microsoft Dynamics NAV Server might execute such a query without returning the connection to the pool.

    Filtering Your Statements

In the SQL Server Profiler trace you will see statements from different connections. This is because you can have multiple clients running or, for example, SQL Server Reporting Services or some other service enabled. It is important to filter out everything except what is coming from the client you are profiling.

To do that, for each SQL statement you need to find the first preceding comment with the same SPID. If this comment is Return connection to the pool, then the SQL statement did not originate from the AL code of the client being profiled.

The user name in the comment identifies the client that generated the SQL statement.

    Collecting the Data and Analyzing

Before the profiler is started, the server should be "warmed up"; otherwise there will be a lot of queries for metadata reading. The scenarios that are going to be profiled should have been executed at least once beforehand.

After the SQL trace is collected, it should be saved into a SQL Server table.

Below is an example of a SQL script that finds the most expensive AL statements. The trace was saved into the NAV7_Trace table in the master database.

    SELECT * FROM [master].[dbo].[NAV7_Trace] --query the trace table content

    DECLARE @ApplicationName NVARCHAR(100)
    DECLARE @GetConnection NVARCHAR(100)
    DECLARE @ReturnConnection NVARCHAR(100)
    DECLARE @ContainsUserName NVARCHAR(100)
    DECLARE @EmptyCallStack NVARCHAR(100)

    SET @ApplicationName = 'Microsoft Dynamics NAV Service'
    SET @GetConnection = '%Get connection%'
    SET @ReturnConnection = '%Return connection%'
    SET @ContainsUserName = '%User: Your user name%'
    SET @EmptyCallStack = '/*__User: Your user name__*/'

    IF OBJECT_ID('tempdb..#ProfilerData') IS NOT NULL
     DROP TABLE #ProfilerData

    SELECT * INTO #ProfilerData FROM
    (
     SELECT
      [RowNumber] AS [SqlStatement RowNumber],
      [TextData] AS [SQL Statement],
      [Reads],
      [Writes],
      [Duration],
      [StartTime],
      [EndTime],
      [SPID]
     FROM [master].[dbo].[NAV7_Trace]
     WHERE
      [ApplicationName] = @ApplicationName and
      [TextData] not like @ContainsUserName and
      [TextData] not like @GetConnection and
      [TextData] not like @ReturnConnection and
      [TextData] not like @EmptyCallStack
    ) SqlStatement
    CROSS APPLY
    (
     SELECT TOP 1
      [RowNumber] AS [Stack RowNumber],
      [TextData] AS [Call Stack]
     FROM [master].[dbo].[NAV7_Trace]
     WHERE
      [SPID] = SqlStatement.[SPID] and
      [RowNumber] < SqlStatement.[SqlStatement RowNumber] and
      [ApplicationName] = @ApplicationName and
      [TextData] like @ContainsUserName
     ORDER BY [RowNumber] DESC
    ) AS Stack

    SELECT * FROM #ProfilerData --this table contains mapping of SQL statements to the AL call stack

    SELECT
     CAST([Call Stack] AS NVARCHAR(4000)) AS [Call Stack],
     SUM(Duration) AS [Sum Duration],
     AVG(Duration) AS [Average Duration],
     MIN(Duration) AS [Min Duration],
     MAX(Duration) AS [Max Duration],
     SUM(Reads) AS [Sum Reads],
     SUM(Writes) AS [Sum Writes]
    FROM #ProfilerData
    GROUP BY CAST([Call Stack] AS NVARCHAR(4000))
    ORDER BY [Sum Duration] DESC -- this query finds the most expensive AL statements

    The result of the previous query shows the most expensive AL calls. The second and fifth rows show the total time spent in SQL calls that were issued by the server but did not originate from AL.
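As a cross-check on the grouping logic, here is a small Python sketch (purely illustrative, with made-up stacks and numbers) of the same aggregation: sum and average the durations per call stack, then rank by total duration.

```python
from collections import defaultdict

# Sketch: aggregate (call_stack, duration, reads) rows per call stack
# and rank the stacks by total duration, like the GROUP BY query above.
def rank_stacks(rows):
    totals = defaultdict(lambda: [0, 0, 0])  # stack -> [sum_duration, count, sum_reads]
    for stack, duration, reads in rows:
        agg = totals[stack]
        agg[0] += duration
        agg[1] += 1
        agg[2] += reads
    ranked = [(stack, dur, dur // n, reads) for stack, (dur, n, reads) in totals.items()]
    return sorted(ranked, key=lambda r: r[1], reverse=True)

rows = [
    ("Sales-Post(CodeUnit 80).OnRun(Trigger) line 1556", 1200, 40),
    ("Sales-Post(CodeUnit 80).OnRun(Trigger) line 1556", 800, 25),
    ("Gen. Jnl.-Post Line(CodeUnit 12).InsertGLEntry line 59", 500, 10),
]
# The most expensive stack comes first: total 2000, average 1000, 65 reads.
print(rank_stacks(rows)[0])
```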

    You can also create queries that find the SQL statements corresponding to particular call stacks.

    SELECT * FROM #ProfilerData
    WHERE [Call Stack] like '%"Sales-Post"(CodeUnit 80).OnRun(Trigger) line 1556%'

    SELECT * FROM #ProfilerData
    WHERE [Call Stack] like '%"Gen. Jnl.-Post Line"(CodeUnit 12).InsertGLEntry line 59%'

    SELECT * FROM #ProfilerData
    WHERE [Call Stack] like '%"Item Jnl.-Post Line"(CodeUnit 22).ApplyItemLedgEntry line 252%'

    It is also easy to create a query that counts the number of times the same call stack occurs in the trace.

    SELECT
     COUNT(CAST([TextData] AS NVARCHAR(4000))) AS Count,
     CAST([TextData] AS NVARCHAR(4000))
    FROM [master].[dbo].[NAV7_Trace]
    WHERE
     [ApplicationName] = @ApplicationName and
     [TextData] like @ContainsUserName and
     [TextData] not like @GetConnection and
     [TextData] not like @ReturnConnection and
     [TextData] not like @EmptyCallStack
    GROUP BY CAST([TextData] AS NVARCHAR(4000))
    ORDER BY COUNT(CAST([TextData] AS NVARCHAR(4000))) DESC

    -Dmytro Sitnik
     

    Excel Buffer Using Open XML Instead of Excel Automation (Part 1 of 2)


    In Microsoft Dynamics NAV, there are several areas that enable the user to perform analysis in Microsoft Excel. Areas such as importing and exporting budgets, analysis by dimensions, and a number of selected reports all use Excel Buffer to export data to Microsoft Excel. In Microsoft Dynamics NAV 2013, the exporting technology has changed to gain better performance in the export as well as to enable more advanced customization capabilities. In Microsoft Dynamics NAV 2013, Excel Buffer has changed from being a chatty client automation solution to one that uses the Open XML SDK 2.0 for Microsoft Office and renders the Excel workbook on the server side.

    This article explains some of the changes that were made to the Excel Buffer table (table 370) and gives examples of how to use and extend the table with the new Open XML functionality from C/AL.
     
    The Excel Buffer works as a temporary table, which is used when you import or export data between Microsoft Dynamics NAV and Microsoft Excel. For the export part, you have the additional option of adding formulas and formatting to the Excel cells.
     
    When an Excel workbook is rendered, you use the File Management codeunit (codeunit 419) to download the file from the server to the client, and then use Office .NET Interop to open it in Excel and do final formatting, such as Auto Fit Columns.
     
    A couple of objects that use the Excel Buffer are the Import Budget from Excel report (report 81) and the Export Budget to Excel report (report 82). These objects are a good source of inspiration for how to use it. In this article, I'm extracting some of that logic to give a simple and clear example of import and export.

    Example 1: Formatting and summarization

    1. In the Microsoft Dynamics NAV Development Environment, create a new codeunit.

    2. Add a new temporary record variable for table 370 called ExcelBuffer.

    3. Add the following lines of code, which add values, formulas, and simple formatting to the Excel Buffer table.

    // Copyright © Microsoft Corporation. All Rights Reserved.
    // This code released under the terms of the
    // Microsoft Public License (MS-PL, http://opensource.org/licenses/ms-pl.html).

    // Add values to the Excel Buffer table.
    ExcelBuffer.NewRow;
    ExcelBuffer.AddColumn('Header',FALSE,'',FALSE,FALSE,FALSE,'',ExcelBuffer."Cell Type"::Text);
     
    ExcelBuffer.NewRow;
    ExcelBuffer.AddColumn(123.45,FALSE,'',FALSE,FALSE,FALSE,'',ExcelBuffer."Cell Type"::Number);
     
    ExcelBuffer.NewRow;
    ExcelBuffer.AddColumn(-223.45,FALSE,'',FALSE,FALSE,FALSE,'',ExcelBuffer."Cell Type"::Number);
     
    // Add formula, second parameter TRUE.
    ExcelBuffer.NewRow;
    ExcelBuffer.AddColumn('SUM(A2:A3)',TRUE,'',FALSE,FALSE,FALSE,'',ExcelBuffer."Cell Type"::Number);
     
    // Include custom format for the cell.
    ExcelBuffer.NewRow;
    ExcelBuffer.AddColumn('SUM(A2:A4)',TRUE,'',FALSE,FALSE,FALSE,'#,#0.0;[blue](#,#0.0)',ExcelBuffer."Cell Type"::Number);
     
    // Create and write the content from the Excel buffer table to Open XML Excel server file.
    ExcelBuffer.CreateBookAndOpenExcel('Sheet ABC','Header',COMPANYNAME,USERID);

    4. After adding the lines, compile and run the codeunit from the Microsoft Dynamics NAV Development Environment. The Microsoft Dynamics NAV Windows client opens, executes the codeunit, and opens Excel with the data from the codeunit. The formatting capabilities, including summarization and coloring, are shown in the Excel sheet.

    Example 2: Reading from Excel sheet
     
    Before trying the following example, you need to save the Excel file from Example 1 in the following location: C:\TEMP\ExcelBufferReadBookScenario.xlsx.
     
    This example will illustrate the reading capabilities that are possible.

    1. Create a new codeunit.
    2. Add a new temporary record variable for table 370 called ExcelBuffer.
    3. Add a new text variable called MessageValue.
    4. Add the following lines of code.
     
    // Copyright © Microsoft Corporation. All Rights Reserved.
    // This code released under the terms of the
    // Microsoft Public License (MS-PL, http://opensource.org/licenses/ms-pl.html).

    ExcelBuffer.DELETEALL;

    ExcelBuffer.OpenBook('C:\TEMP\ExcelBufferReadBookScenario.xlsx','Sheet ABC');
    ExcelBuffer.ReadSheet();

    IF ExcelBuffer.FINDFIRST THEN
      REPEAT
        MessageValue := MessageValue + ExcelBuffer."Cell Value as Text" + '\';
      UNTIL ExcelBuffer.NEXT = 0;

    MESSAGE(MessageValue);
     
    5. Run the codeunit from the Microsoft Dynamics NAV Development Environment. The Microsoft Dynamics NAV Windows client opens, and the content of the Excel workbook is read into the Excel Buffer table and presented to the user in a message box, one line per row.
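Conceptually, ReadSheet fills the buffer with one record per non-empty cell, and the loop above simply concatenates the text values. Below is a language-neutral model of that loop, sketched in Python (the record shape is an assumption for illustration, not the actual table definition):

```python
# Sketch: model the Excel Buffer as cell records sorted by (row, column)
# and concatenate the text values, as the C/AL REPEAT..UNTIL loop does.
def concat_cell_values(buffer):
    """buffer: list of dicts with 'row', 'col', and 'text' keys."""
    message = ""
    for cell in sorted(buffer, key=lambda c: (c["row"], c["col"])):
        message += cell["text"] + "\\"  # '\' renders as a line break in a NAV MESSAGE
    return message

buffer = [
    {"row": 1, "col": 1, "text": "Header"},
    {"row": 2, "col": 1, "text": "123.45"},
    {"row": 3, "col": 1, "text": "-223.45"},
]
print(concat_cell_values(buffer))
```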

    -Lars-Bo Christensen


    Use Open XML to Extend the Excel Buffer Functionality (Part 2 of 2)


    In Microsoft Dynamics NAV, there are several areas that enable the user to perform analysis in Microsoft Excel. Areas such as importing and exporting budgets, analysis by dimensions, and a number of selected reports all use Excel Buffer to export data to Microsoft Excel. In Microsoft Dynamics NAV 2013, the exporting technology has changed to gain better performance in the export as well as to enable more advanced customization capabilities. In Microsoft Dynamics NAV 2013, Excel Buffer has changed from being a chatty client automation solution to one that uses the Open XML SDK 2.0 for Microsoft Office and renders the Excel workbook on the server side.

    In the blog post Excel Buffer Using Open XML Instead of Excel Automation - (Part 1 of 2), I demonstrated simple write and read scenarios. This blog post will go into details on how you can use the Open XML API in C/AL to extend the Excel Buffer functionality.
     
    I will show what can be done with some additional cell formatting. The following areas will be covered:

    • Exposing the Microsoft Dynamics NAV Open XML workbook writer object in table 370.
    • Using the decorator parameter to provide additional formatting.
    • Creating an OpenXmlHelper class to work around the lack of support for generics in C/AL.
    • Showing the file management capabilities by downloading the file from the server to the client without opening it directly in Excel.

    Exposing the Microsoft Dynamics NAV Open XML Workbook Writer Object in Table 370

    You need to expose the workbook writer object from table 370 by adding the following method. In this way, you make it possible to apply functionality from the extensive and verbose Open XML API. The easiest way is to export the Excel Buffer table (table 370), add the following procedure, and then reimport and compile the object.

    PROCEDURE GetWorkbookWriter@21(VAR WrkBookWriter@1000 : DotNet "'Microsoft.Dynamics.Nav.OpenXml, Version=7.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35'.Microsoft.Dynamics.Nav.OpenXml.Spreadsheet.WorkbookWriter");
    BEGIN
      WrkBookWriter := XlWrkBkWriter;
    END;
     
    Creating an OpenXmlHelper Class to Work Around the Lack of Support for Generics in C/AL

    Due to a limitation in C/AL, you cannot use generics as part of the C/AL DotNet object type. Therefore, you need to create a helper class that can handle the places where the Open XML SDK only uses generics. This helps avoid using unnecessary, complex looping of collections in C/AL code. Often tasks like this are much easier to implement in a DotNet helper class. An example of this is described below where I have a simple method that can append an Open XML element to a collection.
     
    To create the OpenXmlHelper class, do the following steps:
    1. Open Visual Studio.
    2. Create new C# class library project called OpenXmlHelper.
    3. Ensure that signing is enabled on the project, as the assembly needs to have a strong name key.
    4. Reference the DocumentFormat.OpenXml.dll from the Open XML SDK. The default installation is typically in the following location: C:\Program Files (x86)\Open XML SDK\V2.0\lib\DocumentFormat.OpenXml.dll.
    5. Rename Class1 to OpenXmlHelper.
    6. Add a using statement DocumentFormat.OpenXml to the file.
    7. Create the method: AppendChild by copying the following code.

    public static bool AppendChild(OpenXmlElement parent, OpenXmlElement child)
    {
      if (parent.HasChildren && parent.Contains(child))
      {
        return false;
      }
      parent.AppendChild(child);
      return true;
    }

    8. Compile and place the OpenXmlHelper in an OpenXML Add-ins folder for the client and server. The default installation will point to the following locations: C:\Program Files (x86)\Microsoft Dynamics NAV\70\RoleTailored Client\Add-ins\OpenXML and C:\Program Files\Microsoft Dynamics NAV\70\Service\Add-ins\OpenXML.
    9. Import the ExcelBuffer Extensibility codeunit (attached) and compile it. This codeunit shows how to add decorations as well as how to download the file to the client.

    The first part of the codeunit is the same Excel Buffer usage as in the blog post Excel Buffer Using Open XML Instead of Excel Automation - (Part 1 of 2), but I have added the workbook writer. In addition, instead of opening the file in Excel, I use the File Management codeunit functionality to download the file from the server to the client, to show a different way of working with the generated file.
     
    // Create new method on ExcelBuffer table to expose the WorkbookWriter.
    ExcelBuffer.GetWorkbookWriter(WrkbookWriter);
     
    // Call method that adds new font with a set of new characters.
    AddingFont(FontDecorator);
     
    // Add a new cell Formula using the new fontdecorator.
    WrkbookWriter.FirstWorksheet.SetCellFormula(4,'A','SUM(A2:A3)','#,#0.0;#,#0.0',FontDecorator);
     
    ExcelBuffer.CloseBook;

    // Download the Excel file from the server to client.
    ExcelBuffer.UTgetGlobalValue('ExcelFile',FileServerName);
    FileClientName := 'C:\Temp\ClientFile.xlsx';
    FileMgt.DownloadToFile(FileServerName,FileClientName);
    MESSAGE(FileClientName);
     
    In the codeunit, there is also an example of how the AddingFont method can be used:

    // Create new Font, cloned from existing Font.
    Font := WrkbookWriter.FirstWorksheet.DefaultCellDecorator.Font.CloneNode(TRUE);
     
    // Create a new Font Color.
    FontColor := FontColor.Color;
    FontColor.Rgb := FontHexValue.HexBinaryValue('0000EEEE');
    Font.Color := FontColor;
     
    // Create a new Font Size.
    FontSize := FontSize.FontSize;
    FontSize.Val := FontSizeDoubleValue.DoubleValue(30);
    Font.FontSize := FontSize;
     
    // Get the collection of Fonts that already exists.
    Fonts := WrkbookWriter.Workbook.WorkbookPart.WorkbookStylesPart.Stylesheet.Fonts;
     
    // Add the new font to the collection of fonts and increase the number of fonts by one.
    IF OpenXmlHelper.AppendChild(Fonts,Font) THEN
      Fonts.Count.Value := Fonts.Count.Value + 1;
     
    // Add the Font to a decorator.
    Decorator := WrkbookWriter.FirstWorksheet.DefaultCellDecorator;
    Decorator.Font := Font;
     
    Running the codeunit will save the file on the client as C:\temp\clientfile.xlsx. In the spreadsheet, the number is now formatted with a larger font and a different color.

    From here, it is up to your own imagination and needs how you extend the functionality with more methods, and so forth.

    The capability of the Open XML SDK API is your only limitation.

    -Lars-Bo Christensen

    Alternative Ways of Starting the Microsoft Dynamics NAV 2013 C/AL Debugger


    The other day a partner asked me how to start the Microsoft Dynamics NAV 2013 C/AL Debugger for a different NAV Server instance than the one used by C/SIDE.

    The explanation is below, but first, let’s see how C/SIDE finds the NAV Server instance used for running and debugging.

    The Microsoft Dynamics NAV Server Instance used by C/SIDE for Run and Debug 

    When you run an application object from the C/SIDE Object Designer, C/SIDE starts a Microsoft Dynamics NAV Windows client with an application URL constructed from:

    • The NAV Server as defined in the File/Database/Information Server Instance field (see Database Information)
    • The currently selected company in C/SIDE
    • The type and ID of the application object you run

    Read more here: Starting the Windows Client at the Command Prompt, the URL parameter.

    This could be an example:

    Microsoft.Dynamics.Nav.Client.exe "DynamicsNAV://localhost:7046/dynamicsnav70/CRONUS International Ltd./RunPage?Page=42"

    The rest of the client configurations are taken from the default ClientUserSettings.config, which is typically found at %APPDATA%\Microsoft\Microsoft Dynamics NAV\70.

    The same principle is used when you start the debugger from within C/SIDE Tools/Debugger/Debug Session.

    All this works quite well in confined setups where you can address different databases and NAV Servers that all use the same authentication type. But if you want to address NAV Servers that use, for example, different authentication types, you have to find other solutions. Feel free to be inspired by the following.

    Define Alternative Ways of Starting the Microsoft Dynamics NAV 2013 C/AL Debugger

    Now let’s say you have made a dedicated NAV Test environment with:

    • a separate database
    • dedicated NAV Server(s)
    • another client authentication type than you are using in your Development environment
    • no development environment installed

    … and you want to debug some issue that appears in the Test environment from your development environment machine.

    The trick is to create Dynamics NAV Windows client shortcuts on your development machine that specify a dedicated client configuration file (the -settings parameter) and the special debug URL, ending up in a combined command-line description:

    Microsoft.Dynamics.Nav.Client.exe -settings:<file> "DynamicsNAV://<Server>[:<Port>]/<ServerInstance>/<Company>/debug"

    Or the short form (everything defaults to the .config settings):

    Microsoft.Dynamics.Nav.Client.exe -settings:<file> "DynamicsNAV://///debug"

    Adding up to shortcut examples like this:

    Microsoft.Dynamics.Nav.Client.exe -settings:"%USERPROFILE%\Desktop\TestEnv.config" "DynamicsNAV://///debug"

    Microsoft.Dynamics.Nav.Client.exe -settings:"%USERPROFILE%\Desktop\ProdEnv.config" "DynamicsNAV://///debug"

    Or with full paths like this:

    "%ProgramFiles(x86)%\Microsoft Dynamics NAV\70\RoleTailored Client\Microsoft.Dynamics.Nav.Client.exe" -settings:"%USERPROFILE%\Desktop\TestEnv.config" "DynamicsNAV://///debug"

    Note

    The Windows client specifically understands the "special" /debug URL, which is currently a shorthand for starting the client with:

    • -debug (that sets the CURRENTEXECUTIONMODE to Debug and groups the debugger clients separately in the taskbar under the debugger icon)
    • -shownavigationpage:0  (i.e. not showing a role center)
    • And do what’s defined in ApplicationManagement (Codeunit 1). LaunchDebugger (Trigger 60)
      • Typically, running the “Session List” (Page 9506)

    The debugger can be started this way because it works just like any other Microsoft Dynamics NAV solution.

    -Hans Kierulff

    How to start any object in Role Tailored Client


    Sometimes we need to run a specific object (page, report, codeunit, or XMLport) in the Microsoft Dynamics NAV 2013 RoleTailored client. Let's say we want to test a report received from a customer on our CRONUS demo database.

    The easiest way is, of course, to run the report directly from the Microsoft Dynamics NAV 2013 Development Environment Object Designer. However, if we have several client versions on our PC, this may not be possible.

    Then we can add an action that runs the object to the initial Role Center.

    Again, I need to create an action for every new object I want to test, which is unproductive.

    A faster way is to add a new menu shortcut in MenuSuite 1010, which then shows up in the RTC menu.

    Nevertheless, this step needs to be repeated for every object I want to test...

    Finally, I have chosen another way:

    1. I created a new page that lists all runnable objects from the Object table and added a function to run the object in the current record (page 50100, attached).
    2. I added a shortcut to this page in MenuSuite 1010.

    Now I have the following:

    1. The page is searchable from the RTC search box, so I can open it quickly from anywhere in RTC.
    2. I can choose any object from the list and run it by double-clicking it or by using the Run action.

     

    Now, whatever I import into Object Designer, I can run it directly from RTC.
    Moreover, I can even run objects from the standard application if I do not know how to find them in the menus. :)

    These postings are provided "AS IS" with no warranties and confer no rights. You assume all risk for your use.

    Gedas Busniauskas
    Microsoft
    Lithuania
    Microsoft Customer Service and Support (CSS) EMEA

    Replacing Field Values in Microsoft Dynamics NAV 2013


    Have you ever had the need to replace some values from a list of values? For example, you need to change the postal/ZIP code for a number of your customers. Most of the time, you would need to write a codeunit or report to accomplish that, or even have the customer update each entry by hand. What if you needed to do the same with your vendors? That would require modifying the previous codeunit/report, creating a new object to handle it, or, again, having the customer update each entry by hand.

    Here is another option – a Replace page that you can add to any page with just a few lines of code. 


    Fig 1. Replace Page

    Field Descriptions:

    • Selected Field – use the AssistEdit button or type the field name to select the field whose values you want to replace.
    • Find What – the value that you want to find and replace with a new value.
    • Replace With – the value you want to replace the original value with.

    Options:

    • Match – you can either match the “Whole field” or “Any part of the field”.
    • Match Case – allows you to match only if the case matches exactly.
    • Replace Whole Field – allows you to replace the whole contents of the field with the new value.
    • Records in Data Set – this is an indicator as to how many records are in your current filtered dataset.

    How to implement the page:

    If you import the attached object text file and follow these steps, you will be able to add this to almost any page (at least where it makes sense). For this example, I will be updating the Customer List page (page 22).

    1. Import and compile the attached object “Replace NAV2013.txt”.

    2. Design the Customer List page (page 22).

    3. Add a global variable named Replace and point it to page 50050 (or whatever you may have changed it to prior to import).

    4. Create a new action.

    5. Go to the OnAction trigger for your new action and add the following code …

    6. Compile your object and you should be ready to use it.

     

    How it works:

    • The function LoadDataSet simply loads the appropriate filtered dataset.
    • The SetValidations function is optional. If you are going to run the Replace page from a page that needs to run validations, then you add the SetValidations function. There are two parameters: ModifyLevel determines whether the OnModify trigger is run for each update, and FieldLevel enables the field OnValidate triggers to be run. The FieldLevel parameter applies to all fields; there is no way (at this point) to determine which fields will run OnValidate and which ones won't.

    Note: When using this Replace page, keep in mind that TIME and DATETIME fields are very sensitive from a matching perspective. Even though a TIME may appear as only HH:MM:SS on the page, there will probably be a millisecond component to the TIME as well. So, for the best results, copy the value of the cell that you want to replace and use that as the value in the "Find What" field. This is true for DATETIME fields as well.
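To see why the whole-field match is so sensitive, consider this small Python sketch. It is illustrative only; Python's datetime stands in for a NAV DATETIME value with a hidden millisecond component.

```python
from datetime import datetime

# The page displays only HH:MM:SS, but the stored value keeps milliseconds.
stored = datetime(2013, 4, 1, 14, 30, 15, 123000)  # stored as 14:30:15.123
typed = datetime(2013, 4, 1, 14, 30, 15)           # what the user types: 14:30:15

display = stored.strftime("%H:%M:%S")  # both values display as "14:30:15"
matches = stored == typed              # False: the hidden milliseconds differ

# Copying the actual stored value (rather than retyping what is displayed)
# is what makes a whole-field match succeed.
copied = stored
print(display, matches, copied == stored)
```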

     

    These postings are provided "AS IS" with no warranties and confer no rights. You assume all risk for your use.

    Microsoft Dynamics NAV 2013 Reporting Design Guidelines


    Reports join different views of data in one place. You design reports in Microsoft Dynamics NAV 2013 using the Microsoft Dynamics NAV Development Environment and Microsoft Reporting Services.

    For Microsoft Dynamics NAV 2013, we have created report design guidelines to help you take advantage of the power of Microsoft Reporting Services.

    Design Concepts

    The basis for the design guidelines is twofold: They should benefit the customer using a report and they should help the partner designing a report.

    For customers to be effective using reports in Microsoft Dynamics NAV 2013, the design of the reports should be such that the reports are:

    • Simple and clean — reports should not contain more information than necessary.
    • Easy to skim and read — optimizing visualizations helps the user going through a report.
    • Consistent — across all report types, reports can be categorized and each category can have the same principles.

    To help partners in designing reports, we have based the principles of the report design guidelines for development on:

    • Simplicity
      • Create a minimum of rules.
      • Use the default options that Visual Studio 2010 offers.
      • Use a standard palette for colors (Bright Pastel).
      • Create a checklist that can be used when designing a report.

    All reports in Microsoft Dynamics NAV 2013 can be classified as one of three types:

    • Documents — formal outgoing reports, for example, sales invoice and order confirmation reports.
    • Simple lists — internal reports that show data at one level with a unique key, row by row. Reports are mostly overviews of master data, for example, the Customer and Vendor List reports.
    • Grouped lists — internal reports that show more complex data grouped per key. Reports are mostly combinations of master data and its connected data, for example, the Customer Detail Aging and Customer/Item Sales reports.

    Examples

    Document Reports: Sales Invoice

    Simple list reports: Resource Usage

    Grouped list reports: Customer – Detail Trial Bal.

    Guidelines

    The following shows the Microsoft Dynamics NAV 2013 report design guidelines, wrapped in a checklist. It is divided into three sections: canvas, header and body.

    Example of a Document Report Design

    For more information, see Report Design Guidelines on MSDN.

    -Coen Overgaag

    Source Code Released for Lync Communication Add-In for Microsoft Dynamics NAV 2013


    Today we can announce that the Lync add-in is available for download in source code. This add-in is a good sample of integrating nicely with other fields on a page while still providing very custom functionality and rendering.

    It has been shown at recent conferences and is also contained on the VPC for Microsoft Dynamics NAV 2013.

    Please read the related blog post here.

    -Christian Abeln

    Senior Program Manager

    Microsoft Dynamics NAV

     

    Application Test Toolset for Microsoft Dynamics NAV 2013


    We recently shipped the Application Test Toolset for Microsoft Dynamics NAV 2013.

    The supplement is applicable to the following country releases of Microsoft Dynamics NAV 2013:

    W1, AU, CA, DE, DK, ES, FR, GB, IN, IT, MX, NL, NZ, SE and US

    The supplement contains the following per country:

    • Tools for managing and executing tests, capturing code coverage information, and selecting relevant tests out of available tests.
    • Between 7,000 and 9,000 real regression tests that we run ourselves, not dumbed-down sample tests.
    • Helper libraries for improving test development through reusing common functionality.
    • Searchable documentation of helper libraries with examples of how to use library functionality.

    You may attempt to apply the W1 version to other country versions, but you should expect parts of the toolset not to work.

    Installation

    To install the supplement, do the following:

    1. Download the supplement from PartnerSource here.
    2. Extract the content.
    3. Import the FOB file found under your corresponding country version folder. For example: .\AppTestToolsetNAV2013\DE\AppTestToolsetNAV2013-DE.fob
    4. After experimenting with the toolset, don’t forget to fill out the survey form and send it to Microsoft. We really appreciate your feedback and we rely on it to improve future versions of the toolset.

    How Do I Use This?

    The simplest way to make use of this supplement is to run the Test Tool page (130021) directly from the development environment. This launches the following page:

    Click Get Test Codeunits and then select All Test Codeunits.

    After Microsoft Dynamics NAV finishes loading all test codeunits, they are displayed on the Test Tool page. To run them, click the Run action and then select All.

    It will take about one to two hours, depending on your machine and the setup, to run all tests. When the run is completed, the results are shown in the Test Tool page:

    Any changes done to the database through running of tests from the Test Tool are automatically rolled back using the Test Isolation testability feature of Microsoft Dynamics NAV 2013. (See the Additional Resources section in this post.)

    During typical development, it is unacceptable to have to wait hours to get results from tests, which is why we have built an advanced test selection feature to help identify the relevant tests. (See the Test Selection section in this post.)

    Alternatively, you can run individual tests or codeunits by selecting them and choosing either Active Line or Active Codeunit after you click the Run action.

    If any test fails, you can attach a debugger session and re-run the failing test. The debugger will then break at the line where the test failed and you will be able to inspect the call stack and examine variables to determine the underlying cause of the failure.

    Extending the Toolset With Your Own Tests

    After you have written your first test codeunit, you can easily integrate it into the tools we provide in this supplement.

    To include your own tests, run the Test Tool page from the development environment, click the Get Test Codeunits action, and choose Select Test Codeunits. This will display a page listing all available test codeunits, including your own:

    Select the codeunits you would like to add to the tool and press OK. The new test codeunits appear at the bottom of the Test Tool list, and you can now select them and run them just like any of the tests we included.

    Again, Test Isolation prevents your tests from persisting changes to the database. During development it may be beneficial to actually see the output produced by the tests. It is possible to disable Test Isolation just by running the test codeunit directly from the development environment, however, instead we recommend attaching a debugger session, breaking at the test entry point, then stepping through test execution and inspecting variables to determine if your test is behaving as expected.

    Speeding Up Development of Your Own Tests

    The tests that we have developed are built on top of a layer of libraries that contain helper functionality to automate many aspects of Microsoft Dynamics NAV. For example, the library named Library – Sales contains functionality related to working with customers and sales documents, including creating new customers, sales headers, sales lines and posting sales documents. The library is extensive and has functionality in many areas of the product, such as finance, service, jobs, warehousing, inventory, etc.

    Instead of re-inventing the wheel when developing your own tests, we highly suggest that you look into our existing helper functionality for functions you can leverage.

    To help you find your way around the libraries, we have shipped a Microsoft Compiled HTML Help file (*.chm), which is bundled together with the .fob file you installed. When you open the .chm file, you are prompted with the following window:

    This lists all our libraries and the functions inside them. However, normally you don't know which library to look in. You can search from the Search tab. Try searching for "finance charge memo" and you will have a couple of choices to pick from:

    Code Coverage Tools

    Code coverage is the means of tracking which parts of the application code have been exercised during some activity. In Microsoft Dynamics NAV, code coverage is recorded per AL code line, and in addition to knowing whether a code line was exercised, it also records the number of times it was hit.

    The code coverage activity that we record can be any interaction with Microsoft Dynamics NAV, be it manual user interaction, automated test execution, NAS, Web services, etc. You can, of course, record code coverage of your own tests exercising your own objects.

    The toolset includes a page (130002), Code Coverage List, which you can use to track code coverage. Run the page from the development environment:

    From this page you can start/refresh/stop the code coverage recorder. If you click the Start action, the code coverage engine is turned on and code coverage is captured. However, you will not be able to see any updated information before you click either Refresh or Stop, at which time you are presented with the code coverage information:

    The information contains coverage of objects, triggers/functions and individual lines (code and empty) as determined by the column Line Type. Only lines of type Code can have coverage. Lines of type Trigger/Function show the average coverage of all code lines in the trigger/function. Lines of type Object show the average coverage of all code lines inside the object.

    From the picture above, you can read that the activity exercised 33.93% of the Currency table (table 4). It covered 100% of the OnModify trigger, and that coverage comes from 100% coverage of a single code line.

    It is often desirable to filter on Line Type = Object to first get a high-level overview of the coverage result:

    Then from here, you can filter to look at individual objects and expand the Line Type filter to include triggers/functions as well:

    This way you can drill-down into the results starting from a high-level view going to a low-level view.

    Note #1: Code coverage is recorded globally for all sessions when using this tool, so make sure you are running in a controlled environment so you don’t have any activity from unaccounted sessions.

    Note #2: Only objects that are touched by the activity are recorded, meaning the coverage of any object not in the list is implied to be zero. If you would like to force the recorder to include specific objects even if they are not covered, you can use the Load objects action and select the relevant objects from the subsequent page. This forces the code coverage engine to load these objects and provide information on them even when no lines are covered.

    Test Selection

    Now that we have all the building blocks in place, I’d like to talk about an advanced feature we included with the tooling.

    As mentioned previously, having to wait hours to run all tests is not feasible from a development point of view. Therefore we shipped the Test Selection, which helps you narrow the set of tests down to the relevant tests.

    The feature works by analyzing the code coverage data from individual test codeunits and comparing it to the set of objects that have the Modified field set to Yes in the database.

    To use this feature, run the Test Tool page, go to the Actions tab, and click the Import/Export Test Map action. On the request page, make sure the direction is Import and click OK. Browse to the bundled "AppTestToolsetNAV2013-<country code>-Map.txt" file and import it. This will take a couple of seconds. After it is done, click the Get Test Codeunits action. The prompt will now include a third option:

    Select this third option and the tool will automatically detect the relevant tests to run and add them to your current suite.

    Note #1: In typical circumstances you would want to make sure your suite is empty before using this feature.

    Note #2: There is a small risk this feature will not identify all relevant tests in unusual circumstances. Thus we strongly recommend running the full regression suite before shipping anything to the customer.

    This feature also integrates with your own tests. Once enabled (by loading the Test Map), the information auto-updates when any test is run – including your own tests. This means that you can load the map, run your tests and then export the map to another text file. You can then load the new map into another database, and the test selection feature will now be able to suggest your own tests based on the modified objects in this other database. If your test codeunit is not present in the database, you will be prompted with a list of missing test codeunits that could not be added. Import the missing test codeunits into the database and re-run the test selection feature.

    Additional Resources

    -Simon Ejsing


    Impact of Classic Runtime Stack Removal on Third-Party Tools


    In Microsoft Dynamics NAV 2013, the Microsoft Dynamics NAV Classic Client has been renamed to the Microsoft Dynamics NAV Development Environment. We did this not only because we like long product names but because we want to call attention to the fact that the development environment is now specifically repurposed to developers only. C/SIDE (Client Server Integrated Development Environment) is no longer an end-user client and we’ve made several changes to the client to support its role. C/SIDE has always been the development environment and it’s been the jumping off point for third-party developer tools, both commercial and community sourced, and we know that the success of Microsoft Dynamics NAV 2013 is going to hinge on those tools continuing to work for you.

    But if C/SIDE no longer supports running C/AL code, then how can we make that claim?

    Before jumping in, I’d like to share a couple of thoughts about where we’re going with the development environment and a few thoughts about the different types of solutions that we’ve seen.

    First, the Application and Client have moved to the Windows Forms Client/NAV Server with the Microsoft Dynamics NAV 2009 release and we’ve had several releases since then. In the meantime, C/SIDE has both stayed as the (almost) single development environment and slowly evolved to its position in Microsoft Dynamics NAV 2013 where it’s no longer an end-user solution. We’ve talked a lot about moving the developer tools to the RoleTailored client, mainly because we think it’s a good fit to have the developer tools as part of the client (or part of the solution environment, however you want to think of it) and we think it’s a good fit because the UX offered by the RoleTailored client is a good match for a development environment. Task pages allow you to work atomically, FactBoxes allow us to share related information, Actions are good for functions, and so on and so forth.

    In Microsoft Dynamics NAV 2013, we made an explicit decision to not build the developer tools in the RoleTailored client and to instead focus on the following areas:

    • Improved debugger. This was a #1 priority as without Forms and the Form Transformation Tool based development process, there was no realistic way to debug a solution.
    • Improved page development. This was also a very high priority. The lack of WYSIWYG support for UI development was often cited as a reason for low uptake of the RoleTailored client.
    • Improved upgrade. We knew there would be lots of changes in Microsoft Dynamics NAV 2013, including breaking old file formats, and we needed to make sure we assisted people rather than putting up more roadblocks.

    In Microsoft Dynamics NAV “8” and beyond, we see new areas that we didn’t think of when we were starting Microsoft Dynamics NAV 2013 and we’re likely to give them equal consideration along with the developer environment. We look at the requirements brought by Azure and the desire to scale. We ask the question -- how does the deployment of a FOB file work when you have 1000 running instances? We also see the value in personalized clients and know that the user personalization settings are orthogonal to the Page in FOB/Text format – how are we going to solve that problem? We are also constantly challenged by questions about source control and other Microsoft product integrations (like Visual Studio based solutions & TFS) and we will factor those thoughts into our planning alongside the main goal of moving all development into the Microsoft Dynamics NAV Server stack.

    But back to the Microsoft Dynamics NAV Development Environment and using the developer environment with third-party tools or extensions. To review a couple of basics, the following have been implemented in Microsoft Dynamics NAV 2013.

    • Native Database Format has been removed. Microsoft Dynamics NAV 2013 is SQL only. Furthermore, the FBK format is still supported, but only Microsoft Dynamics NAV 2013 FBK files can be restored in Microsoft Dynamics NAV 2013.
    • Forms are not supported and have been removed as application objects and removed in the platform. (You can’t just import your old ones – they won’t work.)
    • Dataports are not supported – use XMLPorts.
    • C/AL code no longer runs in C/SIDE. Using the Run button in Object Designer will cause that object to run on the Microsoft Dynamics NAV Server instead.

    After reviewing these points, we saw that in Microsoft Dynamics NAV 2013 any third-party tool needs to be ported to be Page object based. If the tool uses C/AL, then pages and code have to be validated on the Microsoft Dynamics NAV Server.

    We’ve seen several categories of ‘external tools’ and I’ll share those and their migration story here.

    • Pure Vanilla Utility. Pure Form/C/AL utilities exist and can be ported safely to being Page/C/AL based. Examples are small tools for updating data in the database or for generating statistics on data in the database. We recommend that in your Microsoft Dynamics NAV R2 version you run the R2 Form Transformation Tool to generate the pages and then upgrade those to Microsoft Dynamics NAV 2013.
    • Integrated Tool using C/AL Compilation as a sub-step. These tools are similar to Pure Vanilla Utilities and can be ported to being based on pages. The notable exception is when the tool should validate an object using compilation, then a change must be made to use the Microsoft Dynamics NAV Development Environment command line interface. The development environment now offers Import/Export and Compile object as command line functions, which can be used to trigger compilation and get error information via a log file.
    • External tools relying on alternative interfaces/data sources. Some tools have been built on platform features, like Client Monitor, that have not been ported to the Microsoft Dynamics NAV Server. Other tools have relied on unsupported interfaces such as nsobjectxproxy.dll to fish for event information from the object designer. Microsoft Dynamics NAV 2013 ships without nsobjectxproxy.dll and with an alternative implementation of Client Monitor where the information is stored in the SQL trace logs. These tools are likely to require significant redesign work to run in Microsoft Dynamics NAV 2013.

    For many people, the command line options offer an interesting new entry point for the Microsoft Dynamics NAV Development Environment. Examples are shown here:

    finsql.exe command=compileobjects, servername=<server>, database=<database>, [filter=<filter>], [logfile=<path and filename>], [username=<user name>], [password=<password>], [ntauthentication=<yes|no|0|1>]

    finsql.exe command=importobjects, file=<importfile>, servername=<server>, database=<database>, [logfile=<path and filename>], [importaction=<default|overwrite|skip|0|1|2>], [username=<username>], [password=<password>], [ntauthentication=<yes|no|1|0>]

    Thanks for reading and thank you for your continued interest in Microsoft Dynamics NAV 2013 and developer tools. With your feedback, we can make future versions of Microsoft Dynamics NAV even better. I’m always interested to hear your thoughts and suggestions. My email is sglasson@microsoft.com, and I wish you productive development.

    -Stuart Glasson

    NAV Design Pattern of the Week: Blocked Entity


    Describing design patterns is something that has been done for various programming languages and business areas over the years. Likewise, Microsoft Dynamics NAV developers use recurring solutions to solve common problems that are specific to enterprise resource planning software and, even more, specific to the inner workings of NAV. We are giving a name to, and documenting, some of those reusable paradigms. Although the solution offered does not pretend to be the only or the best one at all times, we hope it will provide a good start for more technical discussions, ideas, and knowledge sharing. It also offers ready-made recipes that can be reused during implementations.

    The Reusable Dynamics NAV Patterns is a joint initiative, and I would like to thank the NAV partners who have been here at Microsoft Development Center Copenhagen for the NAV Patterns workshop: Eric Wauters, Gary Winter, Mark Brummel, and Claus Lundstrøm. We are looking forward to seeing more NAV patterns published from their side. Furthermore, this is an open initiative: anyone who has documented design patterns that are specific to NAV, please reach out to us either by leaving a comment here or by writing to us. That said, find below the NAV design pattern of the week.

    Blocked Entity

    Meet the Pattern

    The Blocked Entity pattern is used when it is required to stop transactions for an entity (mostly master data), temporarily or permanently.

    Know the Pattern

    In this pattern, the business entity holds a state that controls whether a given transaction is allowed. The state is used by the logic controlling transactions. The change of state can be either temporary or permanent.

    An example of a temporary halt is when a retail chain selling items has received a lot of complaints about an item, and the company wants to stop all transactions, both purchase and sale, with that item until the dealer has clarified the issue with his supplier and possibly received a replacement for the defective stock. Another common example is counting the physical inventory using cycle counting, where the counting is done in one section of a warehouse at a time so that regular operations can continue in the other parts of the warehouse. In these situations, it is necessary to block all transactions, such as picks and put-aways, for a bin while warehouse counting is in progress for that bin.

    In contrast, a permanent halt to transactions could be required when an item has become obsolete (or is about to become obsolete), and the company wants to stop further purchase or sale of the item. However, the company wants to maintain the transaction history of the item and, therefore, does not want to delete the item record.

    A simple design implementation of such requirements in Microsoft Dynamics NAV is to add a Blocked field in the entity table (and on the associated page). The implementation takes this state into the logic and checks for the value of this field in related transactions. For most simple scenarios, it is sufficient to have two states on the Blocked field, specifying whether it is allowed to perform transactions for the entity or not.

    In certain situations, however, there could be different levels of blocking. For example, the company could block all sales to a customer that has overdue payments, and the company does not want to allow transactions with this customer until the payments are received. In other situations, the customer may have raised objections about an invoice, and the company has decided not to generate new invoices for the customer until the issue has been resolved. However, the company does want to continue shipping goods to the customer so as not to impact the customer’s operations. In these scenarios, it may be necessary to have multiple states on the Blocked field depending on the level of restriction that is needed.

    Use the Pattern

    As mentioned in the previous section, there are two implementations of this pattern depending on business requirements: The 2-state Boolean field for simple implementations and the multi-state option field for more complex requirements. The implementation flow is similar for both patterns, except how the validation is implemented. The following discusses the two scenarios one by one.

    Boolean Implementation

    Add a Boolean field named Blocked in the table.

    In the relevant logic, add a condition to check the status of the Blocked flag. The cheapest way is to use a TESTFIELD:

    <rec variable>.TESTFIELD(Blocked,FALSE);

    Alternatively, you can throw a custom error message. However, you should only do that if the default error message thrown by TESTFIELD is not sufficient.
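    A minimal sketch of that alternative (the text constant name and message text below are illustrative, not from the standard application):

        IF Item.Blocked THEN
          ERROR(ItemBlockedErr,Item."No.");

    where ItemBlockedErr is a text constant such as 'Item %1 is blocked and cannot be used in transactions.'. The custom message lets you explain to the user why the entity is blocked and what to do about it, which TESTFIELD cannot.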

    Option-Field Implementation

    Add an option field named Blocked in the table. The option values will reflect the different blocked states required by the company.

    Add this field on the card page (or on the List page if the entity does not have a card). As with the Boolean implementation, the convention is to add this field in the right-hand column in the General FastTab of the card page.

    Implement a function in the table that takes the transaction context as input and evaluates the Blocked field to decide whether the transaction should be allowed or not. Optionally, the function can be responsible for notifying the user and bubble up an error message straight away.
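    As a sketch, such a function on a customer-like table could look as follows (the function name, option values, and text constants are illustrative, not the standard implementation):

        LOCAL PROCEDURE CheckBlocked(DocumentType : 'Quote,Order,Invoice')
        BEGIN
          // Blocked = All stops every transaction for the entity
          IF Blocked = Blocked::All THEN
            ERROR(BlockedAllErr,"No.");
          // Blocked = Invoice only stops invoicing; shipping may continue
          IF (Blocked = Blocked::Invoice) AND (DocumentType = DocumentType::Invoice) THEN
            ERROR(BlockedInvoiceErr,"No.");
        END;

    Centralizing the check in one table function keeps the blocking rules in a single place, so every posting routine enforces the same policy.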

    Example

    Boolean Implementation

     

    An example of the Boolean implementation on the Item card.

    In codeunit 22 – Item Jnl.-Post Line, the following lines of code have implemented a check based on the value of the Blocked field:

    IF NOT CalledFromAdjustment THEN
      Item.TESTFIELD(Blocked,FALSE);

    Option-Field Implementation

    An example of the option field implementation on the Customer card.

    The CheckBlockedCustOnDocs and CheckBlockedCustOnJnls functions in the Customer table are responsible for validating the Blocked state with respect to the input document type. These functions are invoked in several areas, such as posting routines, where a status check on the Blocked field is required. This is a good practice where the Blocked implementation gets more complex, as this encourages reuse and ensures uniformity of implementation.
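    For example, a posting routine can delegate the check to the table function roughly like this (simplified; the exact parameters of the standard function may differ):

        Cust.GET(SalesHeader."Sell-to Customer No.");
        Cust.CheckBlockedCustOnDocs(Cust,SalesHeader."Document Type",FALSE,FALSE);

    The function then raises an error if the customer's Blocked state forbids the given document type.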

    NAV Usages

    Entities where the Blocked Entity pattern has been implemented include:

    • Item
    • G/L Account
    • Customer
    • Vendor
    • Bin

     

    Best regards

    Bogdana Botez and the Reusable Dynamics NAV Patterns team

    NAV Design Pattern of the Week: Copy Document


    The Reusable Dynamics NAV Patterns is a joint initiative between the NAV team and NAV partners. This is an open initiative: anyone who has documented design patterns that are specific to NAV, please reach out to us either by leaving a comment here or by writing to us. This being said, find below the NAV design pattern of the week.

    Copy Document

    Meet the Pattern

    The goal of the Copy Document pattern is to create a replica of an existing open or closed document (posted or not posted), by moving the lines and, optionally, the header information from the source document to a destination document.

    Know the Pattern

    Documents are widely used by most of our customers. Many times, a significant portion of these documents are similar to each other, either by sharing the same customer, vendor, type, or line structure. Being able to re-use a document as a base for creating a new one is therefore an important means of saving time.

    Other business scenarios require that a newly created document is applied to an existing document. For example, in returns management, a return order can be the reversal of an existing order and can therefore be copied from the original order. Other times, there is even a legal requirement to match the document to its source. For example, credit memos need to be applied to the originating Invoice.

    For these reasons, NAV supports the copying of documents as a method to re-use or link documents.

    The Copy Document functionality is used in the following situations:

    • The user wants to create a new open sales document (Quote, Order, Blanket Order, Invoice, Return Order, Credit Memo) based on an existing posted or non-posted sales document (Quote, Blanket Order, Order, Invoice, Return Order, Credit Memo, Posted Shipment, Posted Invoice, Posted Return Receipt, Posted Credit Memo).
    • The user wants to create a new open purchase document (Quote, Order, Blanket Order, Invoice, Return Order, Credit Memo) based on an existing posted or non-posted purchase document (Quote, Blanket Order, Order, Invoice, Return Order, Credit Memo, Posted Shipment, Posted Invoice, Posted Return Receipt, Posted Credit Memo).
    • The user wants to create a new production order (Simulated, Planned, Firm Planned or Released) based on an existing production order (Simulated, Planned, Firm Planned, Released or Finished).
    • The user wants to create a new assembly order based on an existing assembly document (Quote, Blanket Order, Order and Posted Order).
    • The user wants to create a new service contract or quote based on an existing service contract or quote.
    • The user wants to create all relevant return-related documents. For example, from a sales return order, the user can recreate the involved supply chain documentation, by copying the information upwards to a purchase return order (if the items need to be returned to the vendor), purchase order (if the items need to be reordered), and sales order (if the items need to be re-sent to the customer).

    Note

    • Not all to and from combinations are allowed. For example, you can only copy to open document types, since the posted documents are not editable.
    • The destination document needs to have the header fully created. For example, a Sales Order will need to have the Sell-To Customer No. populated.

    Use the Pattern

    The Dynamics NAV application developer can take into account using the Copy Document design pattern when they have requirements such as:

    • To provide a quick and efficient way of moving content from one document to another.
    • To allow reusing the document history as a template for new documents.
    • To allow linking of documents that need to be applied to each other.

    The Copy Document pattern involves the following entities:

    1. Source document tables for the document header and line. For example, Sales Header/Line.
    2. Destination document tables for document header and line.

      Note: The source document header/line and destination document header/line tables do not need to be the same. For example, you can copy a Sales Shipment Header/Lines into a Sales Header/Lines.

    3. Copy Document engine: COD6620, Copy Document Mgt.
    4. Copy Document report for a specific document type. The report requires the following parameters:
      • Source Document Type
      • Source Document No.
      • Include Header (optional)
      • Recalculate Lines (optional)
      Example: REP901, Copy Assembly Document

    Usage Sequence

    Precondition: The user creates a new destination document header, filling in the required information.

    Step 1: The user runs the Copy Document report (element no. 4), filling in the parameters:

    • Source Document Type
    • Source Document No.
    • Include Header and/or Recalculate Lines (not all Copy Document reports have these).

    Step 2: The report copies the information in the source tables (Header and Line) into the destination tables (Header and Line).

    Post processing: The user performs additional editing of the destination document.

    The sequence flow of the pattern is described in the following diagram.

    Example: Copy Sales Document for Credit Memos.

    In the standard version of Microsoft Dynamics NAV, the Copy Document functionality is implemented in the Sales Credit Memo window as shown in the following section.


    Precondition: The user enters data in PAGE44, Sales Credit Memo.

    Step 1: The user runs REP292, Copy Sales Document from the Sales Credit Memo window, populating the required parameters. The Include Header and Recalculate Lines fields are selected.

    Step 2: The Sales Credit Memo window is populated with information from the source sales document.

    Post processing: The user can now do additional editing of the sales credit memo.

    NAV Implementations

    1. Copy Sales Document (REP292)
    2. Copy Purchase Document (REP492)
    3. Copy Service Document (REP5979)
    4. Copy Assembly Document (REP901)

     

    Best regards

    The Reusable Dynamics NAV Patterns team with special thanks to Bogdan Sturzoiu

    NAV Design Pattern of the Week: Multilanguage Application Data


    The Reusable Dynamics NAV Patterns is a joint initiative between the NAV team and NAV partners. This is an open initiative: anyone who has documented design patterns that are specific to NAV, please reach out to us either by leaving a comment here or by writing to us. This being said, find below the NAV design pattern of the week. (Yes, it's late in the week, but still!)

    Multilanguage Application Data

    Meet the Pattern

    Generally, NAV translation refers to the translation of UI elements like captions and user texts (messages on dialogs, warnings, error messages). This translation is done by the Microsoft Dynamics NAV team before releasing the localized version of the product.

    But there is one more scenario. In this scenario, Cronus International Ltd. wants to sell a "Fiets" to a Dutch customer, a "Cykel" to a Danish one, and a "Bicicletta" to an Italian customer. All three are the same inventory item, and its default name is "Bicycle". But for reporting, Cronus International Ltd. wants to use the customers' language preferences to translate the bicycle's name.

    Sometimes there's a need to support multiple languages for domestic transactions, too. For example, Switzerland has 4 official languages: German, French, Italian and Romansh, the first 3 of them being supported by NAV.

    Know the Pattern

    The example below uses the Item Translation feature of NAV, however, implementations of the same pattern exist for other application areas.

     

    How to use the pattern

    Enter translations for "Bicycle"

    In the Windows client, on the bicycle Item card, on the Home ribbon tab, choose Translations.

    On the opened page, enter the Danish (language code DAN), the Italian (ITA), and the Dutch (NLD) translations for "Bicycle".

    Set the desired language for the Dutch, Danish, and Italian customers

    On the customer cards for your three customers, on the Foreign Trade FastTab, choose the preferred language for each customer. If no language is specified, then the default item description is used for items sold or otherwise associated with that customer. If, for example, the DAN (Danish) language is specified for the customer, and "Bicycle" has a translation in Danish, then this translation, "Cykel", is used instead of the default name "Bicycle".

    See the result

    After those changes, when the customer (in this case the Danish "Lauritzen Kontormøbler A/S") buys a bicycle, the translated description "Cykel" is displayed on documents and reports. For example, creating a sales order for this customer with item no. 1000 shows:

     

    Implement the Pattern

    Create the translation table

    Named "<Entity> Translation" table, where <Entity> is replaced with the name of the actual object being translated. For the Item example above, this table will be named "Item Translation".

    The table definition contains at a minimum:

    Field            Description
    Entity ID field  For example, Item No.
    Language Code    Identifies the language of this translation string (for example, "DAN" (Danish) or "BGR" (Bulgarian)). This is one of the language codes defined in the Languages table.
    Translation      The translated string.

     The table above has a key composed of the first two fields.
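    With this structure in place, a lookup helper with fallback to the default description can be sketched as follows (the function is illustrative; note that the key of the standard Item Translation table additionally includes a Variant Code, passed here as an empty value):

        PROCEDURE GetTranslatedDescription(Item : Record Item;LanguageCode : Code[10]) : Text[50]
        VAR
          ItemTranslation : Record "Item Translation";
        BEGIN
          IF ItemTranslation.GET(Item."No.",'',LanguageCode) THEN
            EXIT(ItemTranslation.Description);
          EXIT(Item.Description); // fall back to the default name
        END;

    Documents and reports call a helper like this with the customer's Language Code, which is how "Cykel" ends up on the Danish customer's sales order.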

    Create the user interface for entering new translations of <Entity>

    • Create a Translations page to present the table created above
    • On the entity card, add a Translations menu option that opens the Translations page

    NAV Usages

    Some of the NAV implementations of this pattern are:

    1. Item Translation
    2. Payment Term Translation
    3. Shipment Method Translation
    4. Unit of Measure Translation

    Related Patterns

    The Extended Text pattern is a more powerful version of the Multilanguage application data pattern presented in this section. The main differences are: 

    Pattern                                    Multilanguage Application Data                 Extended Text
    Supports translation of application data   Yes                                            Yes
    Format                                     Single- or multi-line                          Single-line
    Applies to document type                   Can choose which document types are affected.  All document types are affected.

     

    Best regards

    The Reusable Dynamics NAV Patterns team

    NAV Design Pattern of the Week: Single-Record (Setup) Table


    The Reusable Dynamics NAV Patterns is a joint initiative between the NAV team and NAV partners. This is an open initiative: anyone who has documented design patterns that are specific to NAV, please reach out to us either by leaving a comment here or by writing to us. This said, it's almost the weekend, so here you have the NAV design pattern of the week.

    Single-Record (Setup) Table

    Meet the Pattern

    This pattern is intended for storing information about the operating setup or environment in the database, in a way that persists across sessions. To facilitate this, the information is stored in a table with only one record. The user is subsequently able to modify, but not add or delete, records in the table.

    The most common implementation of this is in the NAV Setup tables.

    Use the Pattern

    Implementation of the pattern involves 3 considerations:

    • Defining a suitable primary key.
    • Creating a page where the user can view and edit the record, but not add new records or delete the existing one.
    • Optionally, initializing the record in the Company-Initialize codeunit.

    Defining a Primary Key

    Since this kind of table is a collection of several environment or setup parameters, the primary key does not refer to any business attribute. However, to maintain the integrity of the database, it is still necessary to define a primary key.

    The most common implementation is to have a "Primary Key" field of type Code[10]. This is populated with a blank value when the record is inserted. The field is not added to the page, so the user cannot modify it later.

    Creating a Page

    The Card page type is most suitable for representing this kind of table. In addition, the InsertAllowed and DeleteAllowed properties on the page should be set to No to prevent the user from adding or deleting records in the table.

    In the OnOpenPage trigger, the following code should be added to insert a record when the user opens the page for the first time, if a record does not exist already.
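    The code in question is the standard NAV idiom for a single-record table, along the lines of:

        OnOpenPage()
        RESET;
        IF NOT GET THEN BEGIN
          INIT;
          INSERT;
        END;

    GET with no arguments reads the record with the blank primary key; if it does not exist yet, INIT and INSERT create it with default values, so the page always has exactly one record to show.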

      

    Company-Initialize Codeunit

    The Company-Initialize codeunit (codeunit 2) is executed when a new company is created. We recommend that you add records to the single-record tables in this codeunit. If some of the fields are expected to have default values, they can also be populated here.

    NAV Usages

    Several Setup tables in NAV implement this pattern. Some of those are:

    1. Table 98 General Ledger Setup
    2. Table 311 Sales & Receivables Setup
    3. Table 312 Purchases & Payables Setup
    4. Table 313 Inventory Setup
    5. Table 242 Source Code Setup

    Note: While most tables just insert a record with an empty primary key in codeunit 2, table 242, Source Code Setup, offers an example of inserting default values into all fields of the table (in the InitSourceCodeSetup function). This practice, wherever feasible, is likely to reduce the effort during implementation.

     

    Best regards

    The Reusable Dynamics NAV Patterns team


