[Fuego] user experience of fserver

Bird, Tim Tim.Bird at sony.com
Thu Feb 13 12:47:57 UTC 2020



> -----Original Message-----
> From: Fuego <fuego-bounces at lists.linuxfoundation.org> On Behalf Of Wang, Mingyu
> 
> hi Tim,
>  
> I tried the fserver according to the user manual.
> 
> I was able to successfully use one local server to send requests to another local server and get results.
> 
> It has not been verified whether the remote server can perform smoothly.
> 
> 
> 
> By using it, there are some inconveniences as follows:
> 
> 1. The directory of the log file cannot be customized
> 
> 2. No information such as request and runs can be seen on the web interface (I don't know if I'm wrong)
> 
> 3. The results are not intuitive, a lot of information is confusing
> 

Thank you very much for this feedback.  I feel a little bad because I know the
code is not production-grade yet, but I was happy to hear that at least some
things worked for you.  I had hoped we could get some utility out of the
feature, even though it is not complete (see below).

What did you mean: "The directory of the log file cannot be customized" ?
Are you talking about on the side that pulls the run information from the
server, or on the server itself?

Items 2 and 3 are correct.  The feature needs more work before it is ready for
full use in Fuego.  I'm still not sure whether placing the results into the
local Jenkins interface is the right thing to do or not.  Right now there is
no place for the results to land, because there is no local Jenkins node for
the remote board.  We could make a "proxy" node for a remote board, so
that results from runs performed by a board in another
lab could be put into the interface for that proxy.  This would require that
I enhance the ability to put non-local data into the Jenkins build area, which
is something I probably need to do anyway.  This is something that will take
a bit of time.

However, let me describe how I'd like to use this feature in the short term,
to see whether the way I envision using it would work for you.

One problem I have as a developer of Fuego is that when I get a test
that does not apply to any of my boards, I have no way to "test the test".
I would like to execute it on a board to which it does apply, and examine
how it works, and what issues it might have being generalized for other boards.

To do this, I would like to be able to run a test on a board in another lab.
Specifically, if Fujitsu submits a test that runs on one of their boards or
distributions of Linux, I would like to be able to execute the test remotely,
and get the results back and examine them.

I am thinking the flow would be like this:
1) Fujitsu submits a new test to Fuego, to the Fuego mailing list
1a) Fujitsu provides the name of a lab and board that is available for executing the test
1b) Fujitsu would execute poll_requests.sh, to automatically poll for jobs for that board.
2) I review the test, and if I have issues or cannot test it, I execute the test on their board.
This means that I will:
2a) ftc put-test the test to the public fserver (on fuegotest.org)
2b) Fujitsu's lab's poll_requests.sh would execute the test and put the results on the
public fserver.
2c) I would collect the results and use them to validate the test
3) I may modify the test to experiment with different alternatives for executing it.
For example, I may add extra debugging steps, or I may change the flow of execution.
After this, I would repeat step 2 until I was satisfied, or until I completed the review
and returned feedback to Fujitsu about the test.
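To make the flow above concrete, here is a rough command-line sketch.  The test and
board names are made up, and apart from 'ftc put-test' and poll_requests.sh (which are
named above), the exact subcommands and option syntax are my guesses at the fserver
interface, not a definitive recipe:

```shell
# 1b) In the Fujitsu lab: poll the public fserver for requests
#     targeting their board (lab and board names are hypothetical)
./poll_requests.sh fujitsu-lab board1 &

# 2a) In my lab: push the test, then request a run of it on their
#     board (option syntax is illustrative)
ftc put-test Functional.newtest
ftc put-request -b board1 -t Functional.newtest

# 2b) poll_requests.sh in the Fujitsu lab picks up the request,
#     runs the test, and puts the run data on the public fserver

# 2c) In my lab: retrieve the run data and examine the results
ftc list-runs
ftc get-run <run_id>
```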

The biggest problem I see with this flow is that in step 3 I can introduce any arbitrary
change to the test, including something that (inadvertently) wipes out the board, or
(if I were malicious) probes the Fujitsu network.  I can imagine that it would be difficult for
Fujitsu (or any company) to allow a test obtained from outside their network to run
inside their network.

Here are some alternatives to this flow that would give me the same benefit of being
able to execute the test on a Fujitsu board:
1) Have Fujitsu send me detailed execution logs (with FUEGO_DEBUG=255) for 
tests that are submitted, so I can see how they execute on your systems.
I would request manual changes to the test, and someone at Fujitsu would change the
code, run it against their boards, and send the results.  (This is the
same thing we're doing now, but with debug test logs attached.)
2) Get a Fujitsu board (and distro?) and put it in my lab locally.  That way Fujitsu
doesn't have to run a test from outside their private network.
3) Do this process manually, instead of automatically.  I could still do the
put-request, but then a Fujitsu person would need to execute commands
to download the test, execute it, and put the results back on the fserver.
There could be a step where they manually check that the test doesn't
do anything weird or dangerous.
It wouldn't be real-time for me, which would delay my review of the test,
but it would still provide me more detailed information, and allow me to
test alternatives.
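As a concrete sketch of alternative 3 (again, subcommand names other than put-test
and put-request are my assumptions about the fserver interface), the manual version
would replace the automatic polling with explicit commands plus an inspection step:

```shell
# In my lab: stage the test and the run request, as before
# (test and board names are hypothetical)
ftc put-test Functional.newtest
ftc put-request -b board1 -t Functional.newtest

# In the Fujitsu lab, instead of running poll_requests.sh:
ftc list-requests                 # see what is pending
# ...download the test and read through its scripts, checking
# that it doesn't do anything weird or dangerous...
ftc run-request <request_id>      # execute it on the board
ftc put-run <run_id>              # upload the results to the fserver
```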

Let me know what you think.
 -- Tim
