d365collaborative / d365fo.tools
Tools used for Dynamics 365 Finance and Operations
License: MIT License
This statement fails on version 0.3.87
The Time variable is not present and the SQL fails.
If I use the following command:
Invoke-D365DBSync -DatabaseServer "DENUE--AX-QRA15" -DatabaseName "AXDB" -SqlUser "sqlAdmin" -SqlPwd "LokalCosm0123!" -LogPath "C:\AOSService\PackagesLocalDirectory_logs" -Verbose
I get the following error:
VERBOSE: CommandFile \bin\SyncEngine.exe
VERBOSE: Parameters -syncmode=fullall -verbosity=normal -metadatabinaries=""
-connect="server=DENUE--AX-QRA15;Database=AXDB;Trusted_Connection=false; User Id=;Password=;"
Start-Process : This command cannot be run due to the error: The system cannot find the file specified.
At line:88 char:16
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
VERBOSE: Process Started
Write-Verbose : Cannot bind argument to parameter 'Message' because it is null.
At line:93 char:19
Write-Verbose $process
~~~~~~~~
Time Taken for sync:
00:00:00.0019951
It seems to me that the SqlUser and SqlPwd parameters are not recognized.
Hello,
In readme.md, you say that we can use Import-AadUser with this syntax:
Import-AadUser -Userlist "[email protected],[email protected]"
but the script fails because the value separator must be ";" and not ",":
Import-AadUser -Userlist "[email protected];[email protected]"
In the PowerShell function, you split by using this command:
$usersFromList = $UserList.Split(";")
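If backwards compatibility with the documented "," separator is desired, a minimal sketch (reusing the same $UserList variable) could accept both separators:

```powershell
# Sketch: split on either ";" or ",", so both documented and actual separators work.
$usersFromList = $UserList.Split(";,".ToCharArray(), [System.StringSplitOptions]::RemoveEmptyEntries)
```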
Implement a Get cmdlet to handle the task of getting labels from a label file.
http://www.agermark.com/2017/12/find-standard-label-translations.html?m=1
When I try to run
Import-D365AadUser -UserList $UserEmail -Verbose
I have got the following
...
VERBOSE: Import the user
VERBOSE: Adding User : [email protected],Glib Holovin,glib.holovin,S-1-19-290.....192,DAT,https://sts.windows.net/,https://sts.windows.net/
VERBOSE: Creating the user in database
VERBOSE: Rows inserted 1 for user [email protected]
VERBOSE: Setting security roles in D365FO database
Exception calling "ExecuteScalar" with "0" argument(s): "Violation of PRIMARY KEY constraint 'I_65492RECIDIDX'. Cannot insert duplicate key in object 'dbo.SECURITYUSERROLE'. The duplicate key value is (5637244459).
The statement has been terminated."
At line:12 char:5
+ $differenceBetweenNewUserAndAdmin = $sqlCommand.ExecuteScalar()
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : SqlException
VERBOSE: Difference between new user and admin security roles
Import-AadUserIntoD365FO : User [email protected] did not get securityRoles
At line:195 char:9
+ Import-AadUserIntoD365FO $SqlCommand $user.SignInName $name $ ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [Write-Error], WriteErrorException
Looks like the SQL code is OK for AX2012, but for D365FO you no longer need to retrieve the next RecId. Just insert a new row and SQL Server will automatically increment the RecId field.
Before exporting the copied database, we need to delete tables that take up too much memory and are not required (history tables).
This reduces our Azure database by several GB.
I suggest a parameter called "Truncate" with a table list,
e.g. -Truncate Batch, BatchHistory, BatchJobHistory, DatabaseLog, SysDatabaseLog etc.
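A minimal sketch of how such a -Truncate option might work, assuming an open $sqlCommand against the copied database (the parameter name and table list come from the suggestion above, not from an existing API):

```powershell
# Sketch: truncate each listed history table before exporting the database copy.
$tablesToTruncate = @("Batch", "BatchHistory", "BatchJobHistory", "DatabaseLog", "SysDatabaseLog")
foreach ($table in $tablesToTruncate) {
    $sqlCommand.CommandText = "TRUNCATE TABLE [dbo].[$table]"
    $null = $sqlCommand.ExecuteNonQuery()
}
```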
Create a cmdlet that has all special flush classes as a separate switch
SysFlushData
SysFlushAod
SysDataCacheParameters
It seems the import command requires a response from the console in order to complete.
This needs a rewrite of the cmdlet.
It seems counter-intuitive to have to provide BackupDirectory when the filename already contains the path.
Either the file path alone is enough, or BackupDirectory provides the path and the filename can be passed without a path.
d365fo.tools/functions/new-d365bacpac.ps1
Line 114 in 9fc8345
Any database connection while on OneBox shouldn't depend on username and password (SQL User), but allow for Windows Authentication by simply omitting the SQL User credentials and letting it connect using TrustedConnection. That way, for many scenarios, we do not have to gather the SQL User credentials from LCS to get started with several database related operations on OneBox environments.
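A sketch of the suggested fallback, assuming the $SqlUser/$SqlPwd parameter names used elsewhere in the module:

```powershell
# Sketch: use Windows Authentication (TrustedConnection) when no SQL credentials are supplied.
if ([string]::IsNullOrEmpty($SqlUser)) {
    $connectionString = "Server=$DatabaseServer;Database=$DatabaseName;Integrated Security=True;"
}
else {
    $connectionString = "Server=$DatabaseServer;Database=$DatabaseName;User Id=$SqlUser;Password=$SqlPwd;"
}
```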
Functions relying on the SQL user password from the AOS configuration require running as Administrator.
Functions should validate that a password is specified when the session isn't running as Administrator.
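One way to sketch that validation with standard .NET APIs (the $SqlPwd parameter name is an assumption):

```powershell
# Sketch: detect elevation and require an explicit password when not running as Administrator.
$identity  = [System.Security.Principal.WindowsIdentity]::GetCurrent()
$principal = New-Object System.Security.Principal.WindowsPrincipal($identity)
$isAdmin   = $principal.IsInRole([System.Security.Principal.WindowsBuiltInRole]::Administrator)

if ((-not $isAdmin) -and [string]::IsNullOrEmpty($SqlPwd)) {
    throw "Run the cmdlet as Administrator, or supply -SqlPwd explicitly."
}
```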
When you import the database to Tier 1 from Tier 2, according to the MS docs (earlier version), you need to run a SQL script to prepare the target database with the correct credentials. Part of this step is to create the database users, with passwords from LCS. However, one can just as easily create the users from the existing SQL Server instance. This removes the dependency on having the passwords gathered from LCS.
So instead of:
CREATE USER axruntimeuser WITH password 'abc'
You can use:
CREATE USER axruntimeuser FROM LOGIN axruntimeuser
This is now reflected in the docs, and the SQL run by d365fo.tools should follow suit:
https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/database/copy-database-from-azure-sql-to-sql-server
:-)
Implement Get and Set cmdlets for the below blog post:
http://d365technext.blogspot.com/2018/07/offline-authentication-admin-email.html?m=1
Implement Enable-D365User
Please implement the ability to streamline the re-activation of an environment after a database refresh from PROD to a sandbox environment. We have always activated the users by executing SQL statements like this:
update userinfo set enable = 1 where ID = 'sys-ax-batch'
To give these new functions an interface similar to the existing Update-D365User, I would suggest using the same parameters. A sample call would be:
Enable-D365User -Email "[email protected]"
Enable-D365User -Email "%contoso.com%"
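A sketch of what the core of such an Enable-D365User could look like, assuming an open $sqlCommand against the AXDB; a LIKE match makes the "%contoso.com%" wildcard call shown above work:

```powershell
# Sketch: enable all users whose email (networkalias) matches the supplied pattern.
$sqlCommand.CommandText = "UPDATE dbo.USERINFO SET ENABLE = 1 WHERE NETWORKALIAS LIKE @Email"
$null = $sqlCommand.Parameters.AddWithValue("@Email", $Email)
$rowsAffected = $sqlCommand.ExecuteNonQuery()
Write-Verbose "Enabled $rowsAffected user(s) matching $Email"
```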
We plan on using the PSFramework to implement a simple solution for storing default configuration details on the machine.
The thought is that the first time we install the module on a machine, the user could run Set-D365MachineConfig -Url XYZ -SqlUser XYZ -SqlPwd XYZ and have the module store this information. When coming back and starting a cmdlet that needs these details, the module would pick them up and you wouldn't have to specify the parameters.
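PSFramework's configuration cmdlets could back such a Set-D365MachineConfig; a sketch (the setting names are made up for illustration):

```powershell
# Sketch: persist machine-wide defaults with PSFramework's configuration system.
Set-PSFConfig -Module "d365fo.tools" -Name "machine.url" -Value $Url
Set-PSFConfig -Module "d365fo.tools" -Name "machine.sqluser" -Value $SqlUser

# Register the values so they survive across PowerShell sessions.
Get-PSFConfig -Module "d365fo.tools" | Register-PSFConfig
```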
Add-NTFSAccess -Path C:\AOSService -Account administrator -AccessRights FullControl -AccessType Allow -InheritanceFlags ContainerInherit
Get-ChildItem -Path C:\AOSService -Recurse | Add-NTFSAccess -Account administrator -AccessRights FullControl -AccessType Allow -InheritanceFlags ContainerInherit
Add-NTFSAccess -Path 'C:\Windows\System32\inetsrv\config' -Account administrator -AccessRights FullControl -AccessType Allow -InheritanceFlags ContainerInherit
Maybe we also need to give access to these:
C:\Windows\WinSxS\amd64_microsoft-windows-iis-sharedlibraries_31bf3856ad364e35_10.0.14393.1198_none_86ae9b2a0f5f134e\redirection.config
C:\Windows\WinSxS\amd64_microsoft-windows-iis-sharedlibraries_31bf3856ad364e35_10.0.14393.1613_none_8670fb260f8e5426\redirection.config
Take the powershell code already written for our customer project
Will require that Get-D365User outputs an email property (networkalias).
Switch-D365ActiveDatabase should support Integrated Security for localhost scenarios (onebox, dev VM, Tier 1). There is no need to provide credentials when using Windows Authentication.
"Integrated Security=True"
If you have a SqlPwd like this "=B5*=ABCD**XXX" you will receive the following error when using New-D365Bacpac.
Exception setting "ConnectionString": "Keyword not supported: 'password=b5*'."
At line:38 char:5
$sqlConnection.ConnectionString = ($Params -join "")
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
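One possible fix is to build the connection string with SqlConnectionStringBuilder, which escapes special characters in values; a sketch:

```powershell
# Sketch: SqlConnectionStringBuilder quotes values containing "=", "*" etc. correctly.
$builder = New-Object System.Data.SqlClient.SqlConnectionStringBuilder
$builder["Server"]   = $DatabaseServer
$builder["Database"] = $DatabaseName
$builder["User Id"]  = $SqlUser
$builder["Password"] = '=B5*=ABCD**XXX'   # the problematic password from the report above
$sqlConnection.ConnectionString = $builder.ConnectionString
```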
We need the ability to add custom commands to ..\internal\sql\clear-azurebacpacdatabase.sql for the process of "Automate the creation of bacpac from a Tier 2 (MS hosted) environment", because we have to DROP FULLTEXT STOPLIST before the export starts. I'm not that familiar with PowerShell, so I don't know if it's possible to implement "hook points"…
Currently I'm modifying the file after installation of the module.
When I was testing this, the TfsURI was not populated. Any idea why that is?
I had to explicitly add it.
As mentioned by valerymoskalenko in issue #36
Inserting records into D365FO should use the sequences in SQL Server instead of the old SYSTEMSEQUENCE table.
The Sequence is named like SEQ_TABLEID
Implement a Get cmdlet that pulls the JSON data from "/api/services".
When using Switch-D365ActiveDatabase and the database specified in NewDatabaseName does not exist, it does not roll back the rename of the original database.
We should implement the ability to do a partial compile with the tool
If you want to use Import-D365Bacpac -ImportModeTier2, you will receive the following error:
Import-D365Bacpac : Parameter set cannot be resolved using the specified named parameters.
At line:1 char:1
+ CategoryInfo : InvalidArgument: (:) [Import-D365Bacpac], ParameterBindingException
+ FullyQualifiedErrorId : AmbiguousParameterSet,Import-D365Bacpac
Install-Module carbon -AllowClobber
Import-Module carbon
Grant-ServiceControlPermission -ServiceName Microsoft.Dynamics.AX.Framework.Tools.DMF.SSISHelperService.exe -Identity administrator
Grant-ServiceControlPermission -ServiceName DynamicsAxBatch -Identity administrator
Grant-ServiceControlPermission -ServiceName MR2012ProcessService -Identity administrator
Grant-ServiceControlPermission -ServiceName w3svc -Identity administrator
We should implement the get / set cmdlets to support the updating of the registry value for the cleanup job retention as mentioned here:
http://www.alexondax.com/2018/04/msdyn365fo-how-to-adjust-your.html
-RenameMachine
parameter to also rename the entire Windows machine name. This enables multiple oneboxes to work against VSTS.
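The built-in Rename-Computer cmdlet could back such a switch; a sketch (the $NewName parameter is an assumption):

```powershell
# Sketch: rename the Windows machine itself as part of the D365FO rename.
if ($RenameMachine) {
    Rename-Computer -NewName $NewName -Force -Restart
}
```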
While installing d365fo.tools with the command Install-Module -Name d365fo.tools I'm getting the following error. This is a brand new 1611 VM with only the PowerShell module for Visual Studio-TFS integration installed.
PackageManagement\Install-Package : A command with name 'Get-LastResult' is already available on this system. This
module 'd365fo.tools' may override the existing commands. If you still want to install this module 'd365fo.tools', use
-AllowClobber parameter.
At C:\Program Files\WindowsPowerShell\Modules\PowerShellGet\1.0.0.1\PSModule.psm1:1772 char:21
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
PSFramework can help us with having configuration files persisted to storage that survive sessions over time.
It provides a powerful message framework that can replace -Verbose in a way where we get a historical verbose message implementation. It means we don't need to run the cmdlet again to see what went wrong - it will be recorded even when running the cmdlet normally.
The following section needs to be changed:
It should be replaced with the target database name.
SELECT TOP 1000
N.[Id] AS XRefIdLabel
,N.[Path] AS XRefLabelPath
,N.[ProviderId] AS XRefLabelProvider
,M.Module AS XRefLabelModule
, R.SourceId AS CodeReferenceId
, N2.[Path] AS CodeReferencePath
, M2.Module AS CodeReferenceModule
FROM [DYNAMICSXREFDB].[dbo].[Names] AS N
INNER JOIN [dbo].[Modules] AS M ON N.ModuleId = M.Id
INNER JOIN [dbo].[References] AS R on N.Id = R.TargetId
INNER JOIN [DYNAMICSXREFDB].[dbo].[Names] AS N2 ON R.SourceId = N2.Id
INNER JOIN [dbo].[Modules] AS M2 ON N2.ModuleId = M2.Id
WHERE N.[Path] LIKE '/Labels/@SYS330300'
After the SQL Select - use the details to find the xml files for the given object and show details. Support number of lines to show, and either before, after or both switch.
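Select-String's -Context parameter maps well onto the suggested before/after switches; a sketch, assuming the object's XML file path has already been resolved:

```powershell
# Sketch: show N lines before and after each match in the object's XML file.
Select-String -Path $xmlFilePath -Pattern 'SYS330300' -Context $linesBefore, $linesAfter
```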
SysDataCacheParameters
We need to double check that all services are included in these cmdlets.
Microsoft.Dynamics.AX.Framework.Tools.DMF.SSISHelperService.exe
New-D365Bacpac -ExecutionMode FromSql -DatabaseServer localhost -DatabaseName AxDB -SqlUser sqluser -SqlPwd "pwd" -BackupDirectory J:\MSSQL_BACKUP\ -NewDatabaseName AxDB_EXPORTCOPY -BacpacDirectory J:\MSSQL_BACKUP\ -BacpacName bacpacname
This command fails as the -ExecutionMode parameter cannot be found. Is it possible that this has been renamed? I see ExportModeTier1 and ExportModeTier2, but when using these it does not find the parameter set.
http://d365technext.blogspot.com/2018/08/environment-hot-fixes-list-using-x.html
Starting point for relevant dll files to work against
Get-D365DotNetClass -PackageDirectory C:\AOSService\webroot\bin\ -Name "metadata" -Assembly "metadata" | Get-D365DotNetMethod -Name "DiskMetadataProvider"
If the bacpac file for whatever reason doesn't exist, the script acts like the file was actually imported (stating "import completed").
Maybe the cmdlet should throw if the file is missing, just as it throws if sqlpackage.exe is missing.
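A sketch of the suggested guard, mirroring the existing sqlpackage.exe check (the $BacpacFile variable name is an assumption):

```powershell
# Sketch: fail fast if the bacpac file does not exist, instead of reporting success.
if (-not (Test-Path -Path $BacpacFile -PathType Leaf)) {
    throw "The bacpac file '$BacpacFile' could not be found."
}
```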
Install-Module NTFSSecurity -AllowClobber
Import-Module NTFSSecurity
Set-NTFSOwner -Path C:\ProgramData\Microsoft\Crypto -Account administrator
Get-ChildItem -Path C:\ProgramData\Microsoft\Crypto -Recurse | Add-NTFSAccess -Account administrator -AccessRights FullControl -AccessType Allow -InheritanceFlags ContainerInherit
When renaming a local VM (onebox) it breaks the VS debugging process.
A fix found by others:
The following files needed to be updated with the correct SPN id and URL.
This is more of a developer preference thing, trying to avoid code that requires horizontal scrolling.
d365fo.tools/functions/import-d365bacpac.ps1
Line 217 in 1867a38
Replace ; with linebreaks instead. ;-)
Should be public and then we can implement different specific Invoke cmdlets. E.g. Invoke-FlushAodCache
http://www.agermark.com/2017/07/clear-aod-cache-including-sysextension.html?m=1
The script is testing the path as if it were a folder and not a file. This causes the script to fail in my testing.
Implement a fix for this:
https://ekprogramming.blogspot.com/2018/04/d365fo-data-management-change-tracking.html
When using this command on a onebox dev vm I get the following error message.
Cannot find an overload for "Replace" and the argument count: "2".
At line:2 char:5
(Get-Content $File).replace($OldValue, $NewValue) | Set-Content $ ...
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The message appears three times; the IIS service is restarted and the active URL is shown (unaltered).
Am I using the tool in the wrong way?
Thanks
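The "Cannot find an overload for Replace" error typically means Get-Content returned $null (empty or missing file) or one of the arguments was $null; a sketch of a guarded replace:

```powershell
# Sketch: only call .Replace() when the file content and the search value are non-null.
$content = Get-Content -Path $File -Raw
if (-not [string]::IsNullOrEmpty($content) -and $null -ne $OldValue) {
    $content.Replace($OldValue, [string]$NewValue) | Set-Content -Path $File
}
```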