[Fuego] [kselftest] Separating testcases into their corresponding group

Daniel Sangorrin daniel.sangorrin at toshiba.co.jp
Wed Nov 15 08:20:34 UTC 2017


Hi Tim, Tuyen

> >> +++ b/engine/tests/Functional.kselftest/criteria.json
> >> @@ -0,0 +1,13 @@
> >> +{
> >> +    "schema_version":"1.0",
> >> +    "criteria":[
> >> +        {
> >> +            "tguid":"exec",
> >> +            "fail_ok_list": ["execveat"]
> >> +        },
> >> +        {
> >> +            "tguid":"timers",
> >> +            "fail_ok_list": ["rtctest"]
> >> +        }
> >> +    ]
> >> +}
> > This is good for docker, but may not be correct for other boards.
> > I accepted this as is (as the generic criteria file for Functional.kselftest).
> > But I actually think we should move the file to the fuego repository, as:
> > fuego/fuego-ro/boards/docker-Functional.kselftest-criteria.json.

I think that's a great idea.

> > Fuego will read a per-board criteria file from that location, and use it in preference
> > to the general criteria file.  (This one would be our very first board-specific
> > criteria file!)  See fuego-core/engine/scripts/parser/common.py:load_criteria()
> > and http://fuegotest.org/wiki/criteria.json#Customizing_the_criteria.json_file_for_a_board
> >
> > Let me know if you have any objections to moving this there.  We would still
> > then need a generic criteria.json file for Functional.kselftest.

Do we need a generic criteria.json in this case?
If I recall correctly, a test without a criteria.json is assigned a default criteria: all
test cases must pass, except those that were skipped.
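For reference, here is a minimal sketch of how that lookup order could work (the paths, the helper name, and the default-criteria shape are my assumptions for illustration, not the actual code in fuego-core/engine/scripts/parser/common.py):

```python
import json
import os

def load_criteria(board, test_name, fuego_ro="/fuego-ro", test_dir="."):
    """Load pass/fail criteria for a test, preferring a board-specific file.

    Sketch of an assumed lookup order (not the real Fuego implementation):
      1. board-specific: <fuego_ro>/boards/<board>-<test_name>-criteria.json
      2. generic:        <test_dir>/criteria.json
      3. default:        empty criteria list, i.e. all executed cases must pass
    """
    candidates = [
        os.path.join(fuego_ro, "boards",
                     "%s-%s-criteria.json" % (board, test_name)),
        os.path.join(test_dir, "criteria.json"),
    ]
    for path in candidates:
        if os.path.exists(path):
            with open(path) as f:
                return json.load(f)
    # No criteria file found: fall back to default criteria
    # (every test case that ran must pass, no fail_ok exceptions)
    return {"schema_version": "1.0", "criteria": []}
```

With that fallback in place, the generic criteria.json for Functional.kselftest would indeed be optional.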

By the way, I think that kselftests do not only output PASS and FAIL; there is also UNSUPPORTED (the test was skipped).
Should we add that case to the parser, so that run.json also contains entries for test cases that were SKIPPED?
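Something like the following sketch is what I have in mind (the log line format and the UNSUPPORTED-to-SKIP mapping are assumptions on my part; the real parser.py for the test may look different):

```python
import re

# Assumed mapping from kselftest output keywords to Fuego result codes;
# UNSUPPORTED is recorded as SKIP instead of being dropped.
RESULT_MAP = {"PASS": "PASS", "FAIL": "FAIL", "UNSUPPORTED": "SKIP"}

def parse_kselftest_log(log_text):
    """Return {testcase: result}, including SKIPped (UNSUPPORTED) cases."""
    results = {}
    # Assumed line format: "selftests: exec/execveat [PASS]"
    pattern = re.compile(r"selftests:\s+(\S+)\s+\[(PASS|FAIL|UNSUPPORTED)\]")
    for line in log_text.splitlines():
        m = pattern.search(line)
        if m:
            name, status = m.groups()
            results[name] = RESULT_MAP[status]
    return results
```

That way every test case that ran ends up in run.json with an explicit result, and the criteria evaluation can distinguish a skipped case from a failing one.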

Thanks,
Daniel
