m3ua and sua testing as part of jenkins?

This is merely a historical archive of years 2008-2021, before the migration to mailman3.

A maintained and still updated list archive can be found at https://lists.osmocom.org/hyperkitty/list/OpenBSC@lists.osmocom.org/.

André Boddenberg dr.blobb at gmail.com
Tue May 23 17:20:08 UTC 2017


Hi Harald,


Thanks for your input on "how to let tests fail"!

Before speaking about publishing the XML reports on Jenkins, I'd
like to make a small infrastructure detour after reading your blog
post.

Long story short, here's a Dockerfile (attached) capable of running
tests on top of a compiled (via ./contrib/jenkins.sh) libosmo-sctp
repository. Only one container is used, and IPs can be created on the
loopback interface thanks to "--cap-add=NET_ADMIN", which imo
is a powerful feature, able to push Docker beyond its usual use
cases.
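To illustrate what NET_ADMIN buys us: inside the container, additional
addresses can be assigned to lo so that each emulated signalling endpoint
binds its own IP. The addresses below are purely illustrative (not taken
from the attached Dockerfile), and the snippet degrades gracefully when
run without the capability:

```shell
#!/bin/sh
# Sketch: extra loopback addresses inside the container
# (requires --cap-add=NET_ADMIN; addresses are illustrative,
# not the ones the attached Dockerfile actually configures).
# Without the capability the "ip addr add" calls simply fail,
# so this is safe to try unprivileged, too.
ip addr add 127.0.0.2/8 dev lo 2>/dev/null || echo "no NET_ADMIN, skipping 127.0.0.2"
ip addr add 127.0.0.3/8 dev lo 2>/dev/null || echo "no NET_ADMIN, skipping 127.0.0.3"
ip -4 addr show dev lo
```

Because each container gets its own network namespace, these addresses
never clash between containers, which is exactly what avoids the
port/IP conflicts on a shared buildhost.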

After building the Dockerfile, execute the following command inside the
compiled libosmo-sctp repository:

docker run -it --rm --cap-add=NET_ADMIN \
  -v "$(pwd)":/workdir/libosmo-sctp \
  <NAME_OF_IMAGE> ./test-m3ua.sh 2>&1 | tee result.plain

Note: result.plain will be saved in the directory where docker run is
executed. In other words, the pipe wraps the docker command, not the
command executed within the container.

Imho this is what we need in order to: "... execute the entire
testsuite e.g. during jenkins test without having IP address or port
number conflicts. It could even run multiple times in parallel on one
buildhost, verifying different patches as part of the continuous
integration setup." [1]. Moreover, we could also split up tests and
execute them in separate containers inside the same job, allowing us
to cut down the verification time of each patch set.

Does the container work for you, too?

What do you think about such an approach in general?


Furthermore, I wrote a prototype-ish script (attached) able to parse
"result.plain" (generated by the docker command above) and create a
valid JUnit XML report, which can then be parsed and visualized by
Jenkins.
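To sketch the idea of the conversion (this is not the attached
create_xml_report.sh, and the real result.plain format produced by
test-m3ua.sh differs; for illustration I assume one "OK <name>" or
"FAIL <name>" line per test case):

```shell
#!/bin/sh
# Hypothetical sketch of a result.plain -> JUnit XML conversion.
# Sample input in an assumed format; the real test output differs.
cat > result.plain <<'EOF'
OK test_aspup
FAIL test_aspac
EOF

# count total and failed test cases
total=$(grep -c -E '^(OK|FAIL) ' result.plain)
failed=$(grep -c '^FAIL ' result.plain)

# emit a minimal JUnit-style report that Jenkins can publish
{
  printf '<?xml version="1.0" encoding="UTF-8"?>\n'
  printf '<testsuite name="m3ua" tests="%s" failures="%s">\n' "$total" "$failed"
  while read -r status name; do
    case "$status" in
      OK)   printf '  <testcase name="%s"/>\n' "$name" ;;
      FAIL) printf '  <testcase name="%s"><failure message="see log"/></testcase>\n' "$name" ;;
    esac
  done < result.plain
  printf '</testsuite>\n'
} > report.xml

cat report.xml
```

Jenkins' JUnit publisher only needs the tests/failures counts and one
testcase element per test; failure details go into the failure element.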

A test job [2] has been created showing what such "publishing" looks
like. One simply clicks on "Latest Test Result" and sees a list of all
failures. Each failure can be expanded to see its
"Stacktrace:/Backtrace:". Furthermore, the first line of such a
"Stacktrace" specifies whether it is a "pure" failure or a timeout.

The console log of a Jenkins job using the mentioned Dockerfile and
script (create_xml_report.sh) would first hold the verbose log of the
docker command. Then a summary would follow as a result of executing
create_xml_report.sh. This gives full details as well as a summary
inside the console log. (Although the summary would use multiple lines
per test case, so it is less pretty.)

In addition, I'd like to mention the possibility of saving the duration
of each test case inside the XML report. This could enable performance
statistics!?
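For what it's worth, the JUnit schema already has a slot for this: each
testcase element may carry a time attribute (in seconds), which Jenkins
picks up for its duration graphs. A hypothetical report fragment (names
and timings made up):

```xml
<!-- hypothetical fragment; test name and timing are made up -->
<testsuite name="m3ua" tests="1" failures="0">
  <testcase name="test_aspup" time="0.42"/>
</testsuite>
```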


What do you think about such an approach to creating and publishing XML
reports in general?


Please note that create_xml_report.sh still needs a cleanup, but at
the same time I thought it's already worth sharing my hands-on
results. Furthermore, feel free to request more details on specific
topics, e.g. JUnit XML reports, Jenkins, Docker (I tried to keep this
mail crisp).


Best Regards,
  André

[1] http://laforge.gnumonks.org/blog/20170503-docker-overhyped/
[2] https://jenkins.blobb.me/job/check_xml_report/
-------------- next part --------------
A non-text attachment was scrubbed...
Name: Dockerfile
Type: application/octet-stream
Size: 2040 bytes
Desc: not available
URL: <http://lists.osmocom.org/pipermail/openbsc/attachments/20170523/75184ec7/attachment.obj>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: create_xml_report.sh
Type: application/x-sh
Size: 1774 bytes
Desc: not available
URL: <http://lists.osmocom.org/pipermail/openbsc/attachments/20170523/75184ec7/attachment.sh>


More information about the OpenBSC mailing list