
FoxUnit's Issues

Wish: "Jump to error" or "Fix" button

When a test fails due to an exception, it would be helpful to have a "Jump to error" or "Fix" button that uses EDITSOURCE() to jump to the line of code that caused the error. Typically this error is not in the unit test itself, but in the tested class or in some class that is used and not mocked. The button could be placed below the "Compare" button and only be enabled when a test failed due to an exception.

The greatest obstacle to the implementation is FoxUnit's reliance on TRY...CATCH. A VFP exception object only carries the method/procedure name and a line number. More detailed information, such as the class library, is not available and cannot be collected with ASTACKINFO() in the CATCH block, because by then it is already after the fact.

FoxUnit could instead use an ON ERROR based approach like the following:

Do ____FoxUnit____Execute____TRY_CATCH_Handler

Procedure ____FoxUnit____Execute____TRY_CATCH_Handler
	Private all like *____FoxUnit____
	Dimension ____FoxUnit____AStackInfo[1]
	____FoxUnit____Message = ""
	____FoxUnit____Error = 0
	____FoxUnit____Line = 0
	* Intercept errors with ON ERROR instead of TRY...CATCH so the full
	* call stack is still available in the handler, then restore the handler.
	____FoxUnit____OnError = On("Error")
	On Error ____FoxUnit____HandleErrorException (Message(), Error(), Lineno(1), Program(-1))
	Do UnitTest
	On Error &____FoxUnit____OnError
	Clear
	List Memory like ____FoxUnit____AStackInfo
	?
	? ____FoxUnit____Message
	?  "error #", ____FoxUnit____Error, "in line", ____FoxUnit____Line
	* Show source
	Local lnLevel
	lnLevel = Alen(____FoxUnit____AStackInfo,1) - 1
	EditSource(____FoxUnit____AStackInfo[m.lnLevel,4], ____FoxUnit____AStackInfo[m.lnLevel,5])
EndProc

Procedure ____FoxUnit____HandleErrorException
LParameter tcMessage, tnError, tnLine, tnStackLevel
	* Capture the call stack while the error is still on it, then trim the
	* array down to the level where the error occurred.
	AStackInfo(____FoxUnit____AStackInfo)
	Dimension ____FoxUnit____AStackInfo[m.tnStackLevel,6]
	____FoxUnit____Message = m.tcMessage
	____FoxUnit____Error = m.tnError
	____FoxUnit____Line = m.tnLine
Return to ____FoxUnit____Execute____TRY_CATCH_Handler

Procedure UnitTest
	Local loObject
	loObject = CreateObject("SomeClass")
	loObject.CauseError ()
	MessageBox("Don't run this")
EndProc

Define CLASS SomeClass as Custom
	Procedure CauseError()
	
		* Intentionally invalid code: raises an error when this method runs
		Accept life, deal with it
	EndProc 
EndDefine 

Bug: AssertEqualsArrays - Invalid Array Subscript

When the arrays differ in their number of rows, an invalid subscript error is reported in ReportArrayMismatch: the VARTYPE() call fails because AssertEqualsArrays passes in an array index of 0.
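
The underlying failure can be reproduced outside of FoxUnit with a minimal sketch (the array name is made up):

Dimension laRows[2, 1]
laRows[1, 1] = "first"
laRows[2, 1] = "second"
* VFP array subscripts start at 1, so a row index of 0 raises an
* invalid subscript error before VARTYPE() can return a type character.
? Vartype(laRows[0, 1])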

Wish: AssertNotImplemented() should return True

AssertNotImplemented() should return True, not False. Here's why:

Reason 1 (tautological): If a test is not implemented, then by definition it's always true that it is not implemented.

Reason 2 (practical): By returning False, AssertNotImplemented() trains us to ignore failing tests.

The general rule of unit testing is that all test results must be green before code is committed or shipped ("all green or no go").

When tests that are not implemented return False, we train ourselves to ignore those red results because we know they're expected. This is a problem because if we get used to seeing and ignoring any red results, even if only those from tests that are not implemented, we might miss a red result from a newly failing test that is implemented.

This can be particularly problematic in larger test suites.

Example: A test suite has 100 tests. The 76 that are implemented return True (green). The 24 that are not implemented are there as placeholders for future tests, and they return False (red). We become accustomed to seeing and ignoring the 24 red rows every time we run the whole test suite. If we then modify or refactor a method and its test subsequently fails, we might not notice there are now 25 failing tests instead of only 24. We might go ahead and commit or ship the code even though not all test results are green, and if we do that we have just released a bug.

On the other hand, if tests that are not implemented return True (green), we know all red results require action and we know we can't commit and ship the code until all tests are green.
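
For example, a typical placeholder test is little more than this (a sketch; the test-method name is made up):

* Placeholder for a test that is yet to be written. Under the current
* behavior this always shows up red; under this proposal it would be green.
Function TestCalculateOvertimeWages
	This.AssertNotImplemented()
EndFunc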

Issues with FoxUnit as installed by Thor

Thor installs FoxUnit 1.66522 dated 11/09/2017.
Issue #1: after installation, FoxUnit does not appear in Thor's list of applications as expected.
Issue #2: launching the Thor-installed foxunit.app directly from the VFP command window crashes VFP.

[screenshot]

Debug shows it's a C5 error:

[screenshot]

Downloading the FoxUnit source code from here on GitHub and then building foxunit.app from the VFP project gives version 1.70, which runs as expected.

Wish: Add One-time Setup and TearDown methods

It would be nice if the FxuTestCase class had methods analogous to Setup() and TearDown() that are called only once during the life of the test class. They might be called OneTimeSetup() and OneTimeTearDown(), and they would be called at the end of Init() and at the beginning of Destroy(), respectively.

These methods would be useful when multiple tests in a test class need to access the same read-only data, or some object that doesn't maintain state. That part of the test environment can be set up once for all the tests in the class, rather than every time an individual test runs.

Obviously, we could enhance the Init() and Destroy() methods of the test class directly. However, Init() already inherits parameters and initialization code, which means a future change to FoxUnit could break any test class that overrides Init(). And although Destroy() currently doesn't contain any code, overriding it carries the same risk.
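
For illustration, a test class using the proposed hooks might look like this (a sketch only; the class header follows FoxUnit's standard templates, the hook names follow the proposal above, and the lookup class is made up):

Define Class WageCalculatorTests as FxuTestCase of FxuTestCase.prg

	oLookupData = .NULL.

	* Proposed hook: called once when the test class is instantiated
	Function OneTimeSetup
		This.oLookupData = CreateObject("ReadOnlyPayRateLookup")
	EndFunc

	* Proposed hook: called once before the test class is released
	Function OneTimeTearDown
		This.oLookupData = .NULL.
	EndFunc

	Function TestOvertimeRate
		* ...assertions against This.oLookupData go here...
	EndFunc

EndDefine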

AssertNotEqualsArrays generates failure message on successful assertion

This issue can be reproduced by running the "testAssertNotEqualsArrays" test in FoxUnitTests.PRG. As shown in the screenshot below, the assertion correctly returns true and the test passes, but an Assertion Failure message is generated anyway. The failure message can be confusing to someone who isn't aware of the bug, especially if the developer included a custom message in the assert.

[screenshot]

Wish: Measure execution time of each test

To be actual unit tests, tests should be quick to run. When running all tests in a class or in several classes, it is currently not possible to spot which test took too long to execute.

  • FoxUnit should measure the execution time of each test in milliseconds and display it in the list of tests (see the timing sketch after this list).
  • A filter should make it possible to limit the list to tests that took longer than a specified threshold.
  • A new TimeOut method could be called in either SetUp() or a test method to specify the maximum time a test may run before automatically failing. See the timeout support in JUnit 4 for reference.
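
A minimal sketch of the timing part, using SECONDS() for millisecond resolution (the commented call is a placeholder, not FoxUnit's actual runner code):

Local lnStart, lnMilliseconds
lnStart = Seconds()
* ...execute one test method here...
lnMilliseconds = Round((Seconds() - m.lnStart) * 1000, 0)
* Store m.lnMilliseconds with the test result so the list can display it
* and a filter can hide tests below the threshold.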

Bug: Using line comment prevents FoxUnit from finding the procedure

I have a procedure definition like this:

Procedure Test_LoadClass && JIRA-1234

There is a comment after the procedure name (it's a ticket number). This brings up the following error message:

Unable to locate the TEST_LOADCLASSmethod -- C:...\Test_Test.prg will be opened in the program editor with the cursor positioned whereever it was last time.

BTW, in that message there should be a space after the method name, and "wherever" is the correct spelling of "whereever".
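
As for the bug itself, a possible fix on the parsing side could strip the inline comment before extracting the name. This is only a sketch under the assumption that FoxUnit reads the method name from the raw source line; the variable names are made up:

Local lcLine, lcName
lcLine = [Procedure Test_LoadClass && JIRA-1234]
If "&&" $ m.lcLine
	lcLine = Left(m.lcLine, At("&&", m.lcLine) - 1)    && drop the trailing line comment
EndIf
lcName = Getwordnum(Alltrim(m.lcLine), 2)
? m.lcName    && "Test_LoadClass"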

Decimal precision lost in assertion messages

I have part of a test that checks if the value coming from a Currency field in a cursor is being rounded to 2 decimal places. Because of the default setting of SET DECIMALS TO, and the way TRANSFORM() formats a currency value, the failure message from FoxUnit looks like this:

-------------------------------
------Assertion Failure
-------------------------------
Pay Rate 00004 calculated wages are incorrect
Values Not Equal
Expected Value: $250.00
Actual Value: $250.00
-------------------------------
------Assertion Failure
-------------------------------
Pay Rate 00005 calculated wages are incorrect
Values Not Equal
Expected Value: $250.00
Actual Value: $250.00

I don't want to take the time to figure out where to plug in this code right now, but I did come up with a replacement function for TRANSFORM() that preserves decimal precision for all numbers, though you do lose the currency formatting. Here's the code that does that:

LPARAMETERS tuValue

LOCAL lcSetFixed, ;
  lcString, ;
  lnSetDecimals

DO CASE
  CASE VARTYPE( m.tuValue ) = "C"
    lcString = m.tuValue

  CASE INLIST( VARTYPE( m.tuValue ), "N", "Y" )
    * Force all relevant decimal places to be displayed.
    lcSetFixed = SET("Fixed")
    SET FIXED OFF
    lnSetDecimals = SET("Decimals")
    SET DECIMALS TO 18

    lcString = TRANSFORM( IIF( VARTYPE( m.tuValue ) = "Y", MTON( m.tuValue ), m.tuValue ) )

    SET FIXED &lcSetFixed.
    SET DECIMALS TO m.lnSetDecimals

  CASE ISNULL( m.tuValue )
    * In case SET NULLDISPLAY is something other than the default.
    lcString = ".NULL."

  OTHERWISE
    lcString = TRANSFORM( m.tuValue )
ENDCASE

RETURN m.lcString
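
For illustration, assuming the function above is saved as PreciseTransform.prg (the file name and the sample value are made up):

? Transform($250.0040)           && with the default SET DECIMALS, the extra decimals are rounded away
? PreciseTransform($250.0040)    && keeps the decimal places, e.g. "250.0040", without the currency formatting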

Bug: FoxUnit does not recompile unit tests

VFP adds compiled FXP code to an internal cache when it instantiates a class with either CREATEOBJECT() or NEWOBJECT(). Subsequent instantiations use the cached version. This is still true when the PRG file has changed, even with SET DEVELOPMENT ON, and CLEAR commands do not remove the FXP code from the cache.

This matters in FoxUnit, because switching branches and reverting changes with Git will update any PRG file, but not the FXP file. If tests from the previous branch have been executed, FoxUnit continues to run the previous tests on the new branch until CLEAR ALL/RELEASE ALL is issued or VFP is restarted. The program below demonstrates this.

While this wouldn't cover the classes under test, at least for the test classes we can make sure they are up to date by recompiling every loaded test program before each test run.

Set Safety Off
Set Development on
Clear All
Release All
Clear

Local lcCode
Text to m.lcCode noshow
Define Class Test as Custom
	Function GetResult
	Return "Hello"
EndDefine 
EndText 
StrToFile(m.lcCode, "Test.prg")
Compile Test.prg

Local loRef
loRef = NewObject("Test","Test.prg","")
? loRef.GetResult()
loRef = null

Clear Program
Clear Class Test

Text to m.lcCode noshow
Define Class Test as Custom
	Function GetResult
	Return "World"
EndDefine 
EndText 
StrToFile(m.lcCode, "Test.prg")

* uncomment this line for a fix
* Compile Test.prg

Local loRef
loRef = NewObject("Test","Test.prg","")
? loRef.GetResult()    && per this bug, still prints "Hello" unless Test.prg is recompiled first
loRef = null

Wish: Allow ESCAPE during testing

Enhance the test runner to SET ESCAPE ON and trap the user pressing the ESC key. I believe this should also cause that test to fail and log an error message indicating that the user pressed ESC.

This would be useful for scenarios where someone (who is probably not you, of course) has accidentally created an infinite loop in the code being tested. Instead of having to kill VFP and start over, the developer can abort the test and correct the problem in a normal refactoring cycle.
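
A minimal sketch of what the runner could do around each test; it assumes that FoxUnit's existing error handling around the test call turns the raised error into a failed test:

Local lcSetEscape, lcOnEscape
lcSetEscape = Set("Escape")
lcOnEscape  = On("Escape")
Set Escape On
On Escape Error "Test aborted: user pressed ESC"
* ...execute the test method here; pressing ESC raises an error that
* fails the test and is logged with the message above...
On Escape &lcOnEscape
Set Escape &lcSetEscape.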

How to handle tests that are not unit tests?

Most of my tests are unit tests. They test a single class or a few classes in an isolated way that doesn't depend on a particular setup in my environment or database. Typically, all of these tests for a project are loaded in FoxUnit. I execute them all when I'm done with a ticket or a change to make sure that everything still works. These tests execute relatively fast.

However, I also have other tests that are not independent. There are tests that communicate with live web services and ensure that the WSDL file hasn't changed or that the API is still working. These tests take considerably more time.

Currently I'm working on classes that use a document scanner, so my tests even require hardware, a UI (selecting the scanner), and a specific setup (putting two pages into the document feeder). These tests are great for development, because I don't have to run the application to test my code and I don't have to set up a test environment. However, they are totally unsuitable to run together with my unit tests. Instead, I call them on a case-by-case basis when I make changes or need to validate certain functionality.

I've considered defining a text file in the project folder that lists different test folders, with "Tests" being selected by default, used when no text file exists, and used when tests are run automatically. When the text file exists and includes a line other than "Tests", FoxUnit would display a dialog when launched and let the user pick a folder. The choice would impact the DataPath property of the FXU instance.

So basically, the Tests folder would contain all the unit tests, but I could also create a folder such as "Integration Tests" or "Hardware Tests" with a separate set of loaded test classes.
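
For illustration, FoxUnit could read such a file at startup (a sketch; the file name FoxUnitFolders.txt is made up):

Local lcFile, laFolders[1], lnFolders
lcFile = "FoxUnitFolders.txt"
If File(m.lcFile)
	lnFolders = ALines(laFolders, FileToStr(m.lcFile))
Else
	* Default when no text file exists
	lnFolders = 1
	laFolders[1] = "Tests"
EndIf
* If m.lnFolders > 1, FoxUnit would show a picker dialog and use the
* selection to set the DataPath property of the FXU instance.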

Does that sound reasonable?

Bug: Error message when editing a built-in template

Repro

  1. Click the New button to invoke the New Test Class dialog
  2. Make sure one of the two default templates is selected
  3. Click on the Modify the Selected Template button (third from the top on the right side of the list box)

Observed
"Invalid path or file name" error message is shown. The template is not opened.

Expected
To get a read-only copy of the embedded template
