[Fuego] Parameter passing to tests

Bird, Timothy Tim.Bird at sony.com
Sat Jun 3 03:35:19 UTC 2017



> -----Original Message-----
> From: Daniel Sangorrin on Friday, June 02, 2017 10:19 PM
> > -----Original Message-----
> > From: Rafael Gago Castano [mailto:RGC at hms.se]
> > > At the moment, what you can do is to configure your job:
> > > Jenkins > Testname > configure
> > >     export PARAM1="hello param 1"
> > >     export PARAM2="hello param 2"
> >
> > What I was doing was adding the parameters to make a parameterized
> > build instead, because we favor working from the CLI (we have to
> > automate everything and do CI), but this is a valid solution too.
> >
> > > That should achieve the same result as using the parameterized build
> > > plugin. But it would be more convenient to add an option to ftc:
> > > ftc add-jobs -p PARAM1="..."
> >
> > Now you can use add-jobs to add multiple jobs at once, so a real call
> > would look like:
> >
> > ftc add-jobs job1 -p PARAM1="p1" -p PARAM2="ps" job2 -p PARAM1="p1" ...
> >
> > What I don't like about this approach is that the extra parameters are
> > far away from where they are defined, so adding a parameter to a test
> > needs a modification in three places (the test, the spec and the script
> > that adds the jobs). I suspect that this wouldn't bail out early when
> > you make a typo or forget that you refactored a parameter name.
> >
> > What I had thought was to modify the JSON file:
> >
> > {
> >   "testName": "Functional.serial",
> >   "params":
> >   [
> >     {
> >       "name"        : "DEV",
> >       "description" : "Serial device to test",
> >       "default"     : "/dev/ttyPS0"
> >     },
> >     {
> >       "name"        : "BAUDRATES",
> >       "description" : "Space separated baudrate list in bps",
> >       "default"     : ""
> >     }
> >   ],
> >   "specs":
> >   [
> >     {
> >       "name"      : "minimal",
> >       "BAUDRATES" : "9600 115200",
> >     },
> >     {
> >       "name"      : "full",
> >       "BAUDRATES" : "<big list with all Linux baudrates>",
> >     }
> >   ]
> > }
> >
> > By doing it this way:
> >
> > -Everything is kept compatible with what's in place now.
> > -All the definitions are still done in one single file.
> > -All available test parameters are integrated into both the jenkins
> >  web and command line interfaces and can be overridden for a single run.
> > -The different specs generate different project names in jenkins, and
> >  this can be used to our advantage when different specs test different
> >  things (more baud rates in this example). At the same time it still
> >  keeps user-specific data out of the fuego repository.
> > -There is the parameter description that acts as test documentation
> >  (and is shown on the Jenkins web interface too).
> > -All the parameters in the spec can be validated (for typos).
> > -The cli for add-jobs is kept as it is now (a bit simpler).
> > -The user can override one parameter while still using the spec (and
> >  test/jenkins project name), e.g. if a serial device uses the full spec
> >  (and wants to appear on jenkins as "board.Functional.serial") but
> >  doesn't support all the baud rates on the full baud rate list.
> 
> I like the fact that we can have defaults for each parameter, and that
> it can serve as an explanation of what each parameter does and as a list
> of the parameters supported by the test (so it would deprecate the yaml
> files, I guess).
> 
> My only concern then is: how do you avoid Jenkins job name collisions?
> For example, if you create 2 jobs with the same spec but override the
> parameters.
> 
> > Implementation remarks (that I can think of just now):
> >
> > -All the test parameter values under "params" should be echoed before
> >  initialization, so they end up in the jenkins log.
> 
> Currently they will be echoed to prolog.sh and jenkins log automatically.
> 
> > -All the parameters would keep the same names that they have now, but
> >  they would be defined by jenkins (or at least the parameters in the
> >  scripts would get their values from jenkins parameters with similar
> >  names).
> > -The default value in jenkins can come from the "default" field in the
> >  parameter map or from the "spec". So we could say that the spec values
> >  just override the default value of the parameter, which is enough for
> >  launching tests without parameters.
> > -If the params array is defined, all the parameters in the different
> >  specs are validated for existence (typos). If the params array isn't
> >  defined, the parameter list is deduced from all the parameters
> >  populated in the different test specs.
> 
> I think there are two types of validation.
> 1) Check that there are no misspellings. This can be done in a generic
> way for any test with the "params" list that you propose.
> 2) Check parameter dependencies. For example, if PARAM1 is defined then
> PARAM2 should not be empty, or something like that. We could do part of
> this generically by adding some syntax to the JSON spec, or leave it to
> each fuego_test.sh.
> 
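The generic check in 1) seems straightforward to do at job-creation time.
Here is a rough sketch in python, assuming the JSON layout proposed above
(illustrative only, not actual fuego code):

    # sketch: every key used in a spec (other than "name") must appear
    # in the declared "params" list, so typos bail out early
    def validate_specs(test_json):
        declared = set(p["name"] for p in test_json.get("params", []))
        for spec in test_json.get("specs", []):
            for key in spec:
                if key != "name" and key not in declared:
                    raise ValueError("unknown parameter '%s' in spec '%s'"
                                     % (key, spec["name"]))
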
> > > Configuring your job with new params is OK for debugging but in the
> > > long run you may also want to save those values into your board's
> > > testplan.
> >
> > In our case it would be more convenient to skip the testplan feature
> > completely. We maintain 3 releases at the same time for many devices,
> > and that would end up in tedious duplication if we don't generate the
> > files dynamically. If we are scripting the file generation anyway, it
> > is the same effort to send each entry the script produces to the
> > jenkins CLI as to an intermediate testplan file.
> 
> Yes, a script calling "ftc add-jobs" with the parameters is another perfectly
> valid option.
> 
> > We are dependent on test run ordering too, because we have a board
> > setup/teardown test, but I guess that we hit a bug in this case (things
> > stored inside a python dictionary?).
> 
> Ordering is currently not implemented.
> 
> One option is to add an --after parameter like this:
> ftc add-jobs Functional.Kselftest --after Functional.kernel_build --spec 4.4.y
> # and the same for the testplans
> 
> This can be easily implemented in Jenkins.
I think a flag to automatically order tests in a testplan sequentially would be good.
The mechanism for this could be the 'after' functionality in Jenkins.  But I think
the use case of 'run these tests in the order listed' is quite common.
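A sketch of how such a flag might chain the jobs when they are created
(the flag name and helpers here are hypothetical, not current ftc
behavior):

    # sketch: with a hypothetical --ordered flag, make each job in the
    # testplan trigger after the previous one, using Jenkins'
    # "build after other projects are built" mechanism
    def add_ordered_jobs(job_names):
        prev = None
        for job in job_names:
            create_jenkins_job(job)              # existing creation step
            if prev:
                set_upstream_trigger(job, prev)  # hypothetical helper
            prev = job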

> 
> > > My proposal is as follows:
> > >   - testplans should generally be created by the user. We provide
> > >     some testplans for reference only. In other words, a testplan is
> > >     where you put your "dirty" changes.
> >
> > I agree. Fuego should aim to keep its repositories free of user-local
> > modifications.
> >
> > > - The idea is that you could add 3 jobs to your testplan using the
> > >   same iperf spec, and then _OVERWRITE_ those variables that you
> > >   want to change (e.g. IP address)
> >
> > IMO the JSON approach described + RAW jenkins enqueueing is preferable
> > for us, but either your proposed solution or mine renders the testplan
> > feature incomplete. If I understand correctly, the _OVERWRITE_ would be
> > a new keyword/parameter in the testplan files to fix exactly this.
> >
> 
> There would be 2 ways of overriding a parameter:
> 1) by specifying the value in "ftc add-jobs -p PARAM1=xxx".
> 2) by specifying the same parameter in the testplan. The testplan
> parameter would then take precedence over the one defined in the spec.
> # however I still have the concern of job name collisions
Other ways to override the parameter:
3) 'ftc run-test -p PARAM1=xxx' or 'ftc build-job -p PARAM1=xxx'
4) In Jenkins, change the value of the parameterized variable using the web interface
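
Putting these together, the precedence could be expressed as a simple
lookup chain. A sketch (names are hypothetical, not actual fuego code):

    # sketch: highest-priority source wins:
    # command line (-p) > testplan > spec > "default" field in params
    def resolve_param(name, cli_args, testplan, spec, defaults):
        for source in (cli_args, testplan, spec, defaults):
            if name in source:
                return source[name]
        raise KeyError("parameter '%s' is not defined anywhere" % name)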

 -- Tim


