jpoehls / dotnetmigrations

DotNetMigrations is a database migration framework that assists in managing and versioning database changes. It was originally designed as a straight port of the migrations functionality found in the Ruby on Rails framework; however, it has since moved into a more open framework thanks to the power of the Managed Extensibility Framework.

License: Other

C# 97.43% Batchfile 0.35% PowerShell 2.22%

dotnetmigrations's Introduction

 _____        _   _   _      _   __  __ _                 _   _                 
|  __ \      | | | \ | |    | | |  \/  (_)               | | (_)                
| |  | | ___ | |_|  \| | ___| |_| \  / |_  __ _ _ __ __ _| |_ _  ___  _ __  ___ 
| |  | |/ _ \| __| . ` |/ _ \ __| |\/| | |/ _` | '__/ _` | __| |/ _ \| '_ \/ __|
| |__| | (_) | |_| |\  |  __/ |_| |  | | | (_| | | | (_| | |_| | (_) | | | \__ \
|_____/ \___/ \__|_| \_|\___|\__|_|  |_|_|\__, |_|  \__,_|\__|_|\___/|_| |_|___/
                                           __/ |                                
                                          |___/                                 

DotNetMigrations - http://github.com/jpoehls/dotnetmigrations

DotNetMigrations is a database migration framework that assists in managing and versioning database changes. It was originally designed as a straight port of the migrations functionality found in the Ruby on Rails framework; however, it has since grown wings and taken its own path in several areas.

View our CI builds on the CodeBetter TeamCity server.

Available on NuGet at http://www.nuget.org/List/Packages/DotNetMigrations

Build prerequisites

  • PowerShell 2.0
  • .NET Framework 4.0
  • Git must be installed and in your path

Release checklist

  1. Make sure the $public_version number is correct in the .\build.ps1 script.
  2. Make sure the changelog is up-to-date in the .\readme.md file.
  3. Commit any changes up to this point.
  4. Tag the release in Git with git tag vX.Y.Z where X.Y.Z is the $public_version from the build script.
  5. Run .\build.bat.
  6. Add the hash of the tagged release from Git to the changelog entry for the release.
    Use this to get the hash: git log -n1 -r "vX.Y.Z" --pretty=%H
  7. Commit the changelog update.
  8. Look in the .\@artifacts folder for the goods.
  9. Run the .\@artifacts\PublishNuGetPackage.bat script to publish to the NuGet Gallery.
  10. Upload the zipped binaries from .\@artifacts to Github.
  11. Spread the word!

Contributors

git log --format='%aN' | sort -u

  • Darren Cauthon
  • James Eggers
  • Joshua Poehls
  • Kieran Benton

Changelog

  • 0.85 (2011-08-27) 6ff0a1887128dded5533441be4b1de4658e6e223

    This release fixes the breaking changes introduced in versions 0.83 and 0.84.

    WARNING! You must manually modify your [schema_migrations] table in order for this release to run. You need to make the following modifications.

    1. DROP the primary key constraint from the [schema_migrations].[version] column.
    2. Run this SQL: ALTER TABLE [schema_migrations] ADD [id] INT NOT NULL IDENTITY(1,1) CONSTRAINT [PK_schema_migrations] PRIMARY KEY

    A new [id] column will be added and used to locate the max version number in the migration table.
    This works because migrations are always inserted in sequential order so the IDs will also be in the correct order.

    This release also removes support for upgrading from the legacy [schema_info] table used by very early versions of DotNetMigrations. This really shouldn't matter to anyone at this point, but if it does, just run an older version of DNM to perform the upgrade and then switch to the latest DNM release.

  • 0.84 (2011-08-27) 4834af1d7a41be0175083afcca5231b8dcb1713f

    • First release to be published to NuGet! (thanks Darren Cauthon!)
  • 0.83 (2011-08-26) a86b7318122251fe54930f2ebce4155b466ac1a5

    WARNING! This release breaks support for the utc_time and local_time versioning strategies. Only seq_num will work. If you use one of the time-based strategies, wait for v0.85 before upgrading.

    • Added preliminary support for Mono. (Thanks James Eggers!)

    • Fixed issue #21 with migrations not working past version 9.

    • Breaking change! The ##DNM:PROVIDER## token is now /*DNM:PROVIDER*/. The new token format will help ensure that your SQL scripts will run without errors outside of DNM.

    • Added support for a CommandTimeout parameter in the connection strings. This value will be used as the DbCommand.CommandTimeout value when the migration scripts are executed.

      Upgrade Notes:
      You must manually run the following SQL against your database to ensure issue #21 is fixed.

      ALTER TABLE [schema_migrations] ALTER COLUMN [version] [int] NOT NULL

      Note that this is the change that breaks the utc_time and local_time versioning strategies, since timestamps are too large to fit into an [int] column. If you are not using seq_num then DO NOT run this ALTER command and your migrations should continue to execute as they did in the previous version.

  • 0.82 (2011-04-12) 5efb6c28dfdfd4a43e3014985ae9d8a74e1cb5e0

    • Added support for post migration actions that plugins can use to inject functionality that should run after a migration completes. (This was added to support the new PersistentMigrate [pmigrate] command in the dotnetmigrations-contrib project.)
  • 0.81 (2011-04-11) f762dce9614150a7e159238a8860ef9b76fb82d2

    • New seed command that executes scripts in the .\seeds\ folder.

    • New setup command that migrates a database to the latest version and executes scripts in the seed folder.
      This is the same as running:

          > db migrate myConnection
          > db seed myConnection
      
  • 0.80 (2011-03-30) 896915d7a75df1c4939fbcc4b01bc0efe3cbadf4

    • New sequential number seq_num versioning strategy

    • Added support for a ##DNM:PROVIDER## token in migration scripts that is replaced with the connection string's provider name.

    • Added support for migration scripts without placeholders. When the script doesn't have any BEGIN_SETUP, END_SETUP, BEGIN_TEARDOWN, or END_TEARDOWN tokens then the entire script is assumed to be a Setup with no Teardown.

  • 0.70 (2010-08-28) 950f6d5a6ca99a175a4d176208d3cdd08f52dd80

    This release brings yet another significant rewrite. Building on the MEF integration and innovations of 0.6, many improvements have been made to make writing commands easier and safer.

    • New 'connections' command that provides a command line interface for viewing, adding, editing and removing stored connection strings in the config file.

    • Revamped help system that provides detailed info on the usage of all commands, including custom commands.

    • Smarter, more robust parsing of migration scripts that contain GO keywords. Special thanks to the Subtext project, from which we borrowed these improvements!

    • New strongly-typed parsing of command line arguments with support for DataAnnotation attributes for validation and automatic integration with the help system.

    • New CommandBase and DatabaseCommandBase classes for building custom commands.

    • Rewritten data access routines that wrap all bulk operations in transactions. Also much smarter database connection handling.

    • Beefed up unit test suite with full coverage of all critical routines.

    • Changed unit tests to use SQL Server CE 3.5.1 instead of SQL Server Express. This means anyone can run the unit tests now without having to set up a database first.

    • Major updates to the build script to use Psake instead of NAnt to run the unit tests after compilation and improvements for integrating with TeamCity.

    • Lots of bug fixes.

  • 0.60 (2010-01-28)

    This version has been completely rewritten from the ground up using .Net 3.5 and the Managed Extensibility Framework.

    • Completely new directory structure and codebase

    • MEF is used to assist in the inner workings of the application as well as to allow new logs and commands to be created.

    • Unit Tests Project has been added - tests use the NUnit framework

    • An automated build file has been created for NAnt v0.86 b1

    • Core project has been created to allow for easy access to classes required for extending DotNetMigrations

    • Breaking Change! The BulkCopy command has temporarily been removed from the bundled commands of the application. It will be moved to the DotNetMigrations-Contrib project as soon as that project is ready. If you use the bulkcopy command, please continue using v0.5.

      About the DotNetMigrations-Contrib Project
      Thanks to the Managed Extensibility Framework, DotNetMigrations now allows new commands and logging mechanisms to be created and added by anyone. Because of this, the DotNetMigrations-Contrib project is being launched in the very near future (target launch date is Feb 01, 2010). The goal of this sister project is to provide a location for people to share new commands and logs without having to worry about getting a different version of the core DotNetMigrations application from this project.

  • 0.50 (2008-05-21)

    • fixed bug where blank lines before a GO in the migration script would cause an exception to be thrown

    • refactored the code for each 'command' into individual classes that inherit from a base ConsoleCommand class.

    • fixed bug that caused a sql error to be thrown whenever the schema_info table did not exist

    • added the bulkload command

    • added return codes to the application

  • 0.40 (2008-05-08)

    • fixed bug where lines in the SQL migration script were being concatenated together without any whitespace to separate them
  • 0.30

    • uses DbProviderFactories to perform all database connections. The default provider is System.Data.SqlClient. You can specify a different provider by adding a PROVIDER= setting to your connection string, e.g. PROVIDER=System.Data.SqlClient;SERVER=(local);DATABASE=TEST123

    • misc updates made to specifically support SQL Compact 3.5 databases

  • 0.20

    • added the version command
  • 0.10

    • initial release!

License

Copyright (c) 2008-2011, Joshua Poehls
All rights reserved.

Redistribution and use in source and binary forms, with or without modification,
are permitted provided that the following conditions are met:

  * Redistributions of source code must retain the above copyright notice,
    this list of conditions and the following disclaimer.

  * Redistributions in binary form must reproduce the above copyright notice,
    this list of conditions and the following disclaimer in the documentation
    and/or other materials provided with the distribution.

  * Neither the name of DotNetMigrations nor the names of its contributors
    may be used to endorse or promote products derived from this software
    without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED.
IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT,
INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY
OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE
OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED
OF THE POSSIBILITY OF SUCH DAMAGE.


==================================================================================
Portions of the source were taken from the Subtext project and are covered
under the following license. These portions also contain this license in the
header of their source code files.
==================================================================================

Copyright (c) 2005 - 2010, Phil Haack
All rights reserved.

Redistribution and use in source and binary forms, with or without modification, 
are permitted provided that the following conditions are met:

    * Redistributions of source code must retain the above copyright notice, 
    this list of conditions and the following disclaimer.
    * Redistributions in binary form must reproduce the above copyright notice, 
    this list of conditions and the following disclaimer in the documentation 
    and/or other materials provided with the distribution.
    * Neither the name of the Subtext nor the names of its contributors 
    may be used to endorse or promote products derived from this software 
    without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND 
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED 
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, 
INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, 
BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, 
DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY 
OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE 
OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED 
OF THE POSSIBILITY OF SUCH DAMAGE.


dotnetmigrations's Issues

Schema Hashing

Hash will include:

  • tables (catalog, schema, name)
  • table columns (name, default values, all data type information and optionally their order)
  • constraints (foreign key and primary key)
  • views (catalog, schema, name and definition)

Notes:

  • Add a [hash] column to the [schema_info] table to store a hash of the schema taken after each migration (a rough sketch of the hashing step follows this list)
  • On rollback, confirm that the schema hash matches the last migration's [hash]. If it doesn't, throw a warning (or, optionally, throw an error and roll back the transaction). A non-match would indicate either that the Tear Down of the migration didn't correctly return the schema to the previous state, or that changes to the schema were made manually (and not by migration scripts).
  • On migrate up, confirm that the schema hash matches the last migration's [hash] before starting. A non-match would indicate the schema has changed since the last migration was applied. Throw a warning (or error).
  • It should be a config option whether schema mismatches throw warnings or errors (errors would also mean the migration transaction would be rolled back).
  • It should be a config option whether the order of columns in a table is included in the hash.
  • Evaluate the performance of calculating and comparing the hashes. Determine whether the perf hit is substantial enough to warrant including a switch to skip the hashing.
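
A minimal sketch of what the hashing step could look like, assuming SQL Server and reading only table/column metadata from INFORMATION_SCHEMA (constraints and views would need to be folded in the same way). The SchemaHasher name, the query, and the exact set of columns hashed are illustrative only and are not part of DNM today:

using System;
using System.Data.SqlClient;
using System.Security.Cryptography;
using System.Text;

public static class SchemaHasher
{
    // Builds a deterministic string describing tables and columns, then hashes it.
    // Column ordinal position is included or excluded based on the config option.
    public static string ComputeHash(string connectionString, bool includeColumnOrder)
    {
        const string query =
            "SELECT TABLE_CATALOG, TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME, " +
            "       DATA_TYPE, IS_NULLABLE, COLUMN_DEFAULT, ORDINAL_POSITION " +
            "FROM INFORMATION_SCHEMA.COLUMNS " +
            "ORDER BY TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME";

        var sb = new StringBuilder();
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(query, connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    for (int i = 0; i < reader.FieldCount; i++)
                    {
                        // ORDINAL_POSITION is the last column; skip it when order shouldn't affect the hash.
                        if (!includeColumnOrder && i == reader.FieldCount - 1)
                            continue;
                        sb.Append(reader.IsDBNull(i) ? "<null>" : reader.GetValue(i).ToString());
                        sb.Append('|');
                    }
                    sb.Append('\n');
                }
            }
        }

        using (var sha = SHA256.Create())
        {
            byte[] hash = sha.ComputeHash(Encoding.UTF8.GetBytes(sb.ToString()));
            return BitConverter.ToString(hash).Replace("-", string.Empty);
        }
    }
}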

Questions:

  • What happens if you change the option whether to include column order in the hash after migrations have already been applied? This would mean the hashes already taken are all invalid.
  • Option 1: Regenerate the hashes by running a command similar to the ‘test’ command that rolls back the database and migrates it up inside of a transaction. This way we could regenerate all the hashes and then return the database to the original state when done. (This would have the same performance impacts as noted in the ‘test’ command.)
  • Option 2: Always store 2 hashes. One with column order and one without.
    How will we know when the column order hash option has been changed?


Update Argument Structure

The current argument structure is based on convention and can get confusing very quickly when arguments are optional. We may convert to a named-argument convention (using Mono.Options) instead of the current positional convention.
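
Purely for illustration, a minimal sketch of what named arguments could look like with Mono.Options; the option names and descriptions below are hypothetical and are not DNM's actual switches:

using System;
using System.Collections.Generic;
using Mono.Options;

class Program
{
    static void Main(string[] args)
    {
        string connection = null;
        string version = null;
        bool showHelp = false;

        var options = new OptionSet
        {
            { "c|connection=", "name of the stored connection string to use", v => connection = v },
            { "v|version=", "target schema version to migrate to", v => version = v },
            { "h|help", "show this message and exit", v => showHelp = v != null }
        };

        // Anything that isn't a recognized option (e.g. the command name) ends up in 'extra'.
        List<string> extra = options.Parse(args);

        if (showHelp)
        {
            options.WriteOptionDescriptions(Console.Out);
        }
    }
}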

New core "combine" command that will take a range of migrations and produce a single script that can be run separately (for DBAs)

This is important in large scale enterprise deployments. Although development teams can and should develop solely using migrations, making their own choices over whether to have local or shared development databases, it is vital that at the end of a sprint or as part of a release, a single packaged "upgrade" script can be provided to a DBA group to apply to staging and production systems.

It may be that eventually there is no requirement for this within a company, but certainly whilst migrating to more agile methods (including automatically run migrations) it is important to be able to build confidence with and interoperate with existing DBA requirements.

Discover connection strings in projects' web.config and/or app.config

If DNM is installed via NuGet (or otherwise installed in some predictable way alongside a solution) then we should be able to easily discover all of the connection strings stored in the projects' web.config and app.config files. A rough sketch of how this discovery could work appears below.

Pros:

  • User doesn't have to duplicate their connection strings in both their web/app config files AND in the DNM config file.
  • Feature can degrade nicely back to just using connection strings in the DNM config file if no other configs are found or if DNM isn't installed alongside a solution.

Cons:
?
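
A rough sketch of the discovery, assuming DNM is given (or can infer) the solution directory and simply scans it for app.config and web.config files, reading their connectionStrings sections. All names here are illustrative; note that entries inherited from machine.config (such as LocalSqlServer) would also show up and may need filtering:

using System.Collections.Generic;
using System.Configuration;
using System.IO;

public static class ConnectionStringDiscovery
{
    // Finds app.config / web.config files under the given solution directory
    // and returns every named connection string they contain.
    public static IDictionary<string, string> Discover(string solutionDirectory)
    {
        var found = new Dictionary<string, string>();
        var candidates = new List<string>();
        candidates.AddRange(Directory.GetFiles(solutionDirectory, "app.config", SearchOption.AllDirectories));
        candidates.AddRange(Directory.GetFiles(solutionDirectory, "web.config", SearchOption.AllDirectories));

        foreach (string path in candidates)
        {
            var map = new ExeConfigurationFileMap { ExeConfigFilename = path };
            Configuration config = ConfigurationManager.OpenMappedExeConfiguration(map, ConfigurationUserLevel.None);

            foreach (ConnectionStringSettings setting in config.ConnectionStrings.ConnectionStrings)
            {
                // First one wins; later configs don't overwrite an already-discovered name.
                if (!found.ContainsKey(setting.Name))
                {
                    found[setting.Name] = setting.ConnectionString;
                }
            }
        }

        return found;
    }
}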

Move DNM to .NET 4.0

With the recent release of .NET 4, a build of the source using the .NET 4 framework will likely be coming out soon. There are some benefits to using .NET 4; however, not everyone has the ability to upgrade their systems to the .NET 4 runtime right away. Because of this, the next build will probably be cross compiled. There have been no discussions yet on if and when 3.5 support will be dropped.

Version command should list all scripts not applied to database

Currently the version command simply lists the latest migration script version # and the current schema version #. This was useful back when the version numbers were sequential; however, now that the version numbers are UTC timestamps, it isn't very useful to know the specific version of the schema vs. the database. All that tells you is that the versions are different; it doesn't tell you whether just 1 migration is missing or whether 100 are missing.

Proposed changes:

  • If the schema is up-to-date it should say so, and perhaps show the latest version # that it is up-to-date with.
  • If not all migration scripts have been applied to the schema, then it should list the ones that are missing. If more than 20 (say) are missing, it should list the 20 newest and say "plus 100 more" to indicate that more are missing as well. You should be able to run the command again with a switch to output all of them.
  • If there are migration scripts that have been applied to the database but aren't in your migrations folder, then the command should warn you that you seem to be missing migration scripts and should list them, with the same truncation rules as above.
  • If both conditions are true, i.e. you are missing migration scripts that have been applied AND you have scripts that haven't been applied to the database, then you should see both warning outputs. (A sketch of the underlying comparison follows this list.)
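
A small sketch of the underlying comparison, assuming we already have the list of script versions on disk and the list of versions recorded in the [schema_migrations] table; the names and output format are illustrative only:

using System;
using System.Collections.Generic;
using System.Linq;

public static class VersionReport
{
    public static void Print(IEnumerable<long> scriptVersionsOnDisk,
                             IEnumerable<long> appliedVersionsInDb,
                             int maxToShow = 20)
    {
        var onDisk = scriptVersionsOnDisk.ToList();
        var applied = appliedVersionsInDb.ToList();

        // Scripts in the folder that have never been applied to the database.
        var pending = onDisk.Except(applied).OrderByDescending(v => v).ToList();
        // Versions in the database with no matching script file (script is missing locally).
        var missingScripts = applied.Except(onDisk).OrderByDescending(v => v).ToList();

        if (pending.Count == 0 && missingScripts.Count == 0)
        {
            Console.WriteLine("Schema is up-to-date at version {0}.", applied.DefaultIfEmpty(0).Max());
            return;
        }

        PrintTruncated("Migrations not yet applied:", pending, maxToShow);
        PrintTruncated("WARNING - applied migrations with no local script:", missingScripts, maxToShow);
    }

    private static void PrintTruncated(string header, List<long> versions, int maxToShow)
    {
        if (versions.Count == 0) return;
        Console.WriteLine(header);
        foreach (var v in versions.Take(maxToShow))
            Console.WriteLine("  {0}", v);
        if (versions.Count > maxToShow)
            Console.WriteLine("  ...plus {0} more", versions.Count - maxToShow);
    }
}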

Help Text Formatting

When the help text is displayed, words are broken mid-word rather than being wrapped and indented properly.
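
For illustration, a simple word-wrapping helper of the kind this fix would need (wrap on word boundaries and indent continuation lines); this is not the code currently in DNM:

using System;
using System.Text;

public static class TextWrapper
{
    // Wraps text to the given width, breaking on spaces and indenting
    // continuation lines so option descriptions line up in the help output.
    public static string Wrap(string text, int width, int indent)
    {
        var padding = new string(' ', indent);
        var sb = new StringBuilder();
        int lineLength = 0;

        foreach (string word in text.Split(new[] { ' ' }, StringSplitOptions.RemoveEmptyEntries))
        {
            if (lineLength > 0 && lineLength + word.Length + 1 > width)
            {
                sb.AppendLine();
                sb.Append(padding);
                lineLength = indent;
            }
            else if (lineLength > 0)
            {
                sb.Append(' ');
                lineLength++;
            }

            sb.Append(word);
            lineLength += word.Length;
        }

        return sb.ToString();
    }
}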

Change migration script token format

Change the ##DNM:VERSION## and ##DNM:PROVIDER## tokens to be /*DNM:VERSION*/ and /*DNM:PROVIDER*/ instead so that the tokens won't cause syntax errors when running the SQL scripts outside of DNM.

Bulkloading binary data

In the 0.5 version I added support for bulkloading binary data. Could you implement this in the latest version?

In DAL.cs I added

internal static object GetBinaryParameter(string name, string stringvalue)
{
    Byte[] bytevalue;

    if (string.IsNullOrEmpty(stringvalue))
    {
        bytevalue = null;
    }
    else
    {
        // Strip a leading "0x" prefix, then parse the remaining hex string two characters at a time.
        if (stringvalue.StartsWith("0x"))
        {
            stringvalue = stringvalue.Substring(2);
        }

        bytevalue = new Byte[stringvalue.Length / 2];
        for (Int32 i = 0; i < stringvalue.Length / 2; i++)
        {
            Byte.TryParse(stringvalue.Substring(i * 2, 2), System.Globalization.NumberStyles.HexNumber, null, out bytevalue[i]);
        }
    }

    DbParameter p = GetFactory().CreateParameter();
    p.ParameterName = name;
    p.Value = bytevalue;
    return p;
}

And in BulkloadCmd.cs I replaced

                        //  add values to INSERT statement
                        foreach (string columnName in rowParams.Keys)
                        {
                            sb.Append("@" + columnName + ",");
                            cmd.Parameters.Add(DAL.GetParameter("@" + columnName, rowParams[columnName]));
                        }

with

                        //  add values to INSERT statement
                        foreach (string columnName in rowParams.Keys)
                        {
                            sb.Append("@" + columnName + ",");
                            if (rowParams[columnName].StartsWith("0x"))
                            {
                                cmd.Parameters.Add(DAL.GetBinaryParameter("@" + columnName, rowParams[columnName]));
                            }
                            else
                            {
                                cmd.Parameters.Add(DAL.GetParameter("@" + columnName, rowParams[columnName]));
                            }
                        }

SecurityException

For deployment I've created a setup project (MSI) that installs DotNetMigrations locally. On older (Windows XP) machines this worked fine, but on Windows 7 machines the application fails to initialize due to a SecurityException when it is loading dmn.logs from the app.config. I fixed it by editing the last method in src\DotNetMigrations.Core\Provider\ConfigurableTypeCatalog.cs like so:

private static ConfigurableTypeSection GetSection(string sectionName)
{
    Configuration config = ConfigurationManager.OpenExeConfiguration(ConfigurationUserLevel.None);
    ConfigurableTypeSection section = config.GetSection(sectionName) as ConfigurableTypeSection;

    if (section == null)
    {
        throw new ConfigurationErrorsException(string.Format("The configuration section {0} could not be found.", sectionName));
    }

    return section;
}

Using OpenExeConfiguration fixes the SecurityException. This workaround was found here: http://stackoverflow.com/questions/2725432/net-4-0-application-on-network-share-causes-securityexception

Support for other database objects (sprocs, functions)

Stored Procedures and Functions are not a good fit for inclusion in migration scripts. They are better off having their own script files. Consider adding support for /functions and /storedprocs folders alongside the /migrations folder. Either in the migrate command or in a new command, execute all the CREATE scripts in those new folders. (A rough sketch of such a runner follows the list below.)

  • Would be nice if we could detect which Stored Procs & Functions needed to be updated before running the scripts.
  • Scripts should be named with the object name so that we can handle dropping the object if necessary (ex. dbo.mySproc.sql)
  • Justification for keeping function & stored proc scripts out of the migration scripts: History. If each revision to a function and sproc is in a separate migration file you have no easy way to view the history of changes over time as you would with source code files using your VCS diff tools. Functions and stored procs are more closely related to source code routines than they are to table schema objects and this kind of diff / VCS functionality is important for them.
  • Handling rollbacks with non-versioned function & sproc scripts: we would have to check out the revision of the /functions and /storedprocs folders from source control that corresponds to the migration being rolled back to, so that the correct version of the objects is applied. This is very hokey. We need a better way to version these CREATE scripts alongside the migration version without having to include them in migration setup/teardown scripts.
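
A rough sketch of such a runner, assuming a new (hypothetical) step that simply executes every script in those folders; for brevity it ignores GO batch splitting and the drop/recreate handling discussed above:

using System.Data.Common;
using System.IO;

public static class ObjectScriptRunner
{
    // Executes every script in the /functions and /storedprocs folders against the database.
    // Assumes each file is named after the object it creates (e.g. dbo.mySproc.sql) and
    // contains the statements for just that object.
    public static void RunAll(DbConnection connection, string scriptsRoot)
    {
        foreach (string folder in new[] { "functions", "storedprocs" })
        {
            string path = Path.Combine(scriptsRoot, folder);
            if (!Directory.Exists(path)) continue;

            foreach (string file in Directory.GetFiles(path, "*.sql"))
            {
                using (DbCommand command = connection.CreateCommand())
                {
                    command.CommandText = File.ReadAllText(file);
                    command.ExecuteNonQuery();
                }
            }
        }
    }
}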

Seed data should support CSV files

  • CSV files must have headers that match the table's column names.
  • The CSV file name must either match the table name completely (ex. MyTable.csv) or must contain the table name in brackets somewhere in the file name (ex. demo data for [MyTable].csv).
  • If the headers include the Primary Key column then turn identity insert on.
  • The CSV file name must include the table name in some parsable way.
  • Turn off CHECK constraints before the inserts and turn them back on afterwards. (A rough sketch of such a loader follows this list.)
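
A rough sketch of such a loader, assuming SQL Server, a simple comma-separated format with no quoting, and that every row has the same number of fields as the header; all names are illustrative:

using System.Data.SqlClient;
using System.IO;
using System.Linq;

public static class CsvSeeder
{
    // Reads a headered CSV and inserts each row into the named table.
    public static void Seed(string connectionString, string tableName, string csvPath, bool identityInsert)
    {
        string[] lines = File.ReadAllLines(csvPath);
        string[] headers = lines[0].Split(',');

        string columnList = string.Join(", ", headers.Select(h => "[" + h.Trim() + "]").ToArray());
        string parameterList = string.Join(", ", headers.Select((h, i) => "@p" + i).ToArray());

        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();

            if (identityInsert)
                Execute(connection, "SET IDENTITY_INSERT [" + tableName + "] ON");
            Execute(connection, "ALTER TABLE [" + tableName + "] NOCHECK CONSTRAINT ALL");

            foreach (string line in lines.Skip(1))
            {
                string[] values = line.Split(',');
                using (var command = new SqlCommand(
                    "INSERT INTO [" + tableName + "] (" + columnList + ") VALUES (" + parameterList + ")",
                    connection))
                {
                    for (int i = 0; i < values.Length; i++)
                        command.Parameters.AddWithValue("@p" + i, values[i].Trim());
                    command.ExecuteNonQuery();
                }
            }

            Execute(connection, "ALTER TABLE [" + tableName + "] WITH CHECK CHECK CONSTRAINT ALL");
            if (identityInsert)
                Execute(connection, "SET IDENTITY_INSERT [" + tableName + "] OFF");
        }
    }

    private static void Execute(SqlConnection connection, string sql)
    {
        using (var command = new SqlCommand(sql, connection))
            command.ExecuteNonQuery();
    }
}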

Migrations don't work past the number 9 when using an incremental integer

I've been indexing my migration scripts with an incremental integer. After migrating past my tenth script, the system stopped working because dotnetmigrations thought I was only on version 9.

The problem is the select statement here:

const string command = "SELECT MAX([version]) FROM [schema_migrations]";
https://github.com/jpoehls/dotnetmigrations/blob/master/src/DotNetMigrations.Core/BaseClasses/DatabaseCommandBase.cs#L73

Since the version field is nvarchar(14), "9" is considered greater than 10, 11, etc.

The quick fix is to manually change the column type in the database to an integer, but that doesn't help other people who use this library. There are a number of ways to solve this issue, but since this logic is rooted deep in a base class I'm not sure which approach should be used.
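
One possible fix, sketched here as a replacement for that constant, is to compare versions numerically rather than as strings. This assumes every stored version value is numeric, which holds for both the seq_num and timestamp strategies:

// Hypothetical replacement for the query in DatabaseCommandBase - compares versions
// numerically so that "10" sorts after "9". BIGINT leaves room for the 14-digit
// utc_time / local_time timestamps that will not fit in an INT.
const string command = "SELECT MAX(CAST([version] AS BIGINT)) FROM [schema_migrations]";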

New "test" command

Create a new test command that

  • begins a transaction
  • rolls back the database to 0 (if needed)
  • migrates to the latest version
  • then rolls back to 0 again
  • rolls back the transaction (so that the database state is the same as before the test command was run)
  • If no hash warnings (or errors) are thrown during the test then it is deemed successful; otherwise the test fails and reports the specific migration whose TearDown triggered the warning. (A rough sketch of the flow follows below.)

This command could cause the logs to fill up very quickly in a large or production database. A warning to that effect should be shown and confirmed before running this command.
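
A rough sketch of the flow, where migrateTo stands in for DNM's real migration routine (none of these names are actual DNM APIs):

using System;
using System.Data.Common;

public static class TestCommand
{
    // migrateTo is a stand-in for the real migration routine: it should bring the
    // schema to the given version using the supplied connection and transaction.
    public static void Run(DbConnection connection, long latestScriptVersion,
                           Action<DbConnection, DbTransaction, long> migrateTo)
    {
        connection.Open();
        using (DbTransaction transaction = connection.BeginTransaction())
        {
            try
            {
                migrateTo(connection, transaction, 0);                   // roll back to empty (if needed)
                migrateTo(connection, transaction, latestScriptVersion); // migrate up to the latest version
                migrateTo(connection, transaction, 0);                   // tear everything back down
            }
            finally
            {
                // Always roll back so the database ends up exactly as it started.
                transaction.Rollback();
            }
        }
    }
}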

DNM library that can be embedded in your application to run migrations

The focus would be on making a library that could be referenced by an ASP.NET (or other) application. At the least, this library needs a public Migrate(string connectionString, string migrationsFolder) function that can be called to run migrations on the given database.

Use case would be a web application that auto-upgrades the database in the OnApplicationStart global.asax event.

This would greatly simplify deployments by removing the database update step. Just deploy the code and you're done.

This would be safer after the 'test' command has been created so that migration testing could be integrated into the CI tests.
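
For example, an ASP.NET application could call the proposed API from Global.asax. The DotNetMigrations.Embedded namespace and the Migrator.Migrate method are the hypothetical library surface described above, not something DNM exposes today:

using System;
using System.Configuration;
using System.Web;
using System.Web.Hosting;
// Hypothetical namespace/type for the proposed embeddable library.
using DotNetMigrations.Embedded;

public class Global : HttpApplication
{
    protected void Application_Start(object sender, EventArgs e)
    {
        string connectionString =
            ConfigurationManager.ConnectionStrings["AppDb"].ConnectionString;

        // Bring the database up to the latest migration before the first request is served.
        Migrator.Migrate(connectionString, HostingEnvironment.MapPath("~/App_Data/migrations"));
    }
}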

Seed Data

DNM-Contrib? – New Seed Data function. Each seed data script should have a migration number that it gets applied at. When you migrate the database, the seed data scripts up to the current migration should be applied (if they weren't already). This implies we need a new table in the database to track which seed data scripts have been run. We also need an OnMigrated() hook that fires after a migration is run so that our seed data runner can tie in and execute the required scripts. How do we handle rollbacks? Seed data really can't be rolled back; does this matter?

Possible Custom Console

From James Eggers:

I was thinking about the changes that we have planned and wondering if having an inline, custom console mode for DNM might be beneficial. This would allow automation scripts while providing an easy way to work on the migrations themselves. This may be better suited for a future release when more helper commands are available, though.

Specify migrations folder from command line

It would be useful when writing batch files to be able to provide the path to the migration files from the command line.

As it is, it always looks for the path "migrate" from the current working directory.

First-class Mono support

I believe DNM already has a decent level of Mono support; however, I am now at a point where I hate booting up my Windows VM and need to be able to maintain DNM from OS X. So the whole thing needs to support Mono first class.

Here are a few things I know need to happen for this:

  1. Convert build script from PowerShell into something that will run equally well on Windows and Mono.
  2. Figure out if/how we can publish updates to the NuGet package from Mono.
  3. Update the TeamCity CI build to use the new build scripts.
  4. Update all applicable unit tests to use an in-memory SQLite database instead of SQL Server Compact. SQL Server Compact-specific tests can continue to use it, but it should be possible to run the test suite (minus those tests) under Mono easily. The full test suite should be able to run on the TeamCity server.

Timeouts

Hi, I'm still working with the 0.5 version and had a problem with timeouts on scripts that took a while to complete. To fix that I added CommandTimeout = 0 where the DbCommands were being created.
Could you add this to the latest version of DotNetMigrations?
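
For reference, v0.83 added a CommandTimeout connection string parameter that is applied as the DbCommand.CommandTimeout when scripts are executed (see the changelog above). A minimal sketch of that kind of fix, with illustrative names; a timeout of 0 means wait indefinitely, which is what the workaround above used:

using System.Data.Common;

public static class CommandFactory
{
    // Creates a command with the timeout applied wherever migration scripts are executed.
    public static DbCommand Create(DbConnection connection, string commandText, int commandTimeoutSeconds)
    {
        DbCommand command = connection.CreateCommand();
        command.CommandText = commandText;
        command.CommandTimeout = commandTimeoutSeconds;
        return command;
    }
}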
