Topics

Ideas for higher-level badges


David A. Wheeler
 

All: I'd like to start creating the criteria for "higher level" badges. Please reply, or add issues, for things you think should be included.

 

A very early draft is here:

https://github.com/linuxfoundation/cii-best-practices-badge/blob/master/doc/other.md

Stuff we could add at a higher level includes test coverage criteria, bus factors, etc.

 

For the moment we should probably call these "passing+1" and "passing+2"... we can name them silver/gold/platinum/whatever later.

 

--- David A. Wheeler

 


Kevin W. Wall
 

On Thu, Dec 15, 2016 at 7:42 PM, Wheeler, David A <dwheeler@ida.org> wrote:
All: I'd like to start creating the criteria for "higher level" badges.
Please reply, or add issues, for things you think should be included.



A very early draft is here:

https://github.com/linuxfoundation/cii-best-practices-badge/blob/master/doc/other.md

Stuff we could add at a higher level includes test coverage criteria, bus
factors, etc.
David,

My $.02 on the early draft. Note that I've not gone back to review the base
criteria and it's been a while since I've revisited them, so apologies if I
mention something that is already there.

***** Potential passing+1 criteria *****

Under "Regression tests", it states
When a bug is fixed, a regression test MUST normally be added to
the automated test suite to prevent its reoccurrence.

"MUST normally"? That sounds like SHOULD. If it isn't SHOULD, then why the
"normally"? If this is mandatory, but there are exceptions, then those
exceptions will have to be explicitly spelled out.
-----

Under "Reproduceable [sic] build":

This has proven difficult because of upgrades to various build tools
(compilers, assemblers, linkers, etc.) and possible OS dependencies
(e.g., on system headers and libraries). It would be one thing to say
that product XYZ was built on a default install of version m.n.p of
ABC distro of Linux (but of course, even there, how many developers do
you know who have default installs?), and another thing to have to
figure out everything that needs to be specified in the build tool
chain and gather all those dependencies as well as the exact architecture
it was built under. I look at it this way... if someone doesn't trust
the binaries, then let them build it themselves. This is, after all,
supposed to be open source, so that should always be possible. As
for the source, what you see is what you git. (Sorry, I couldn't
resist.) No, seriously, I'm okay with requiring that binaries be digitally
signed or, at a minimum, requiring a hash, but I think if we start
requiring reproducible builds, it will result in those who wish to
comply including the whole bloody tool chain in GitHub or whatever
repo they are using.
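The minimum bar mentioned here (publishing a hash alongside binaries) is cheap to verify on the consumer side. A minimal, illustrative Python sketch; the file name and digest are hypothetical, not from any real release:

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 65536) -> str:
    """Compute the SHA-256 hex digest of a file, reading it in chunks
    so that large binaries do not need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_release(artifact: str, published_digest: str) -> bool:
    """Compare an artifact's digest to the one the project published."""
    return sha256_of(artifact) == published_digest.strip().lower()
```

Digital signatures (e.g., GPG-signed release tags or detached signatures) add authenticity on top of this integrity check, but a published hash alone already lets downstream users detect a corrupted or tampered download.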

-----

Under "Documentation", it states:
The project MUST include reference documentation that describes
its data flow.

Question: how does one do that for a general API project and is it even
meaningful?

-----

A big +1 on "Security analysis / dependencies".

-----

***** Potential passing+2 criteria *****

General criteria / Roadmap exists
I would like to see this moved to the 'passing+1' criteria. It's not
that difficult and when developers are initially evaluating different
possible alternatives, this can provide valuable insight.

If passing+2 is going to be the highest badge level, I'd also like to see
some sort of mandatory code inspection (possibly SAST-assisted) and,
when applicable, some sort of DAST (for APIs, probably just fuzzing),
where failed tests would have to be added to the regression test suite.


***** Potential other criteria *****

I think "Issue tracking for defects" has to be mandatory in some form
or other for the passing+1 level. It doesn't have to be a full-blown
issue tracking system like JIRA, but at least something like GitHub issues.

Under 'Security', you state:
Developers contributing a majority of the software (over 50%) have
learned how to develop secure software

Question: Exactly how would you measure that? Do you just expect them to have
some security-related certification, or to take some specific course, or what?

-----

+1 on ASVS

-----

+1 on cryptographically signing binaries


Okay, so that was more like $1.23. Anyhow, hope that is some useful feedback.

-kevin
--
Blog: http://off-the-wall-security.blogspot.com/ | Twitter: @KevinWWall
NSA: All your crypto bit are belong to us.


Daniel Stenberg
 

On Thu, 15 Dec 2016, Wheeler, David A wrote:

All: I'd like to start creating the criteria for "higher level" badges. Please reply, or add issues, for things you think should be included.
This list certainly increases the requirement level a lot and they're all good practices. However, I'm pretty sure none of the projects I'm involved in during my spare time will be able to even reach passing+1 as suggested now. Simply because of lack of manpower and energy. This includes projects I've spent 10,000+ hours on. (And honestly, most proprietary projects I've worked on during my 25+ years as a software developer wouldn't either.)

But that might not be a bad thing. That might even be exactly what's intended. I'm fine with leaving the higher level badges for projects with more paid staff or perhaps just more (or more efficient) contributors that can manage to keep those practices up. Ideally this leads to volunteers showing up wanting to help us reach passing+N compliance.

This is *not* a complaint on the new suggested levels even if it may sound like that. I think it'll bring light on areas that can and should be improved.

--

/ daniel.haxx.se


David A. Wheeler
 

From: Daniel Stenberg [mailto:daniel@haxx.se]
This list certainly increases the requirement level a lot and they're all good practices.
Great! That's encouraging!

However, I'm pretty sure none of the projects I'm involved in during
my spare time will be able to even reach passing+1 as suggested now.
Simply because of lack of manpower and energy....
But that might not be a bad thing. That might even be exactly what's intended.
Well - we don't want to create badge levels that *no one* will meet. But it's relatively easy to remove criteria, move criteria to a higher level, or weaken "too hard" criteria in some way. I want to move beyond "blank slate".

I currently expect that we'll work to create a "fair but somewhat hard" set of lists, then tweak them to make the +1 and +2 levels easier. Not *too* easy though - there's no point in giving prizes for breathing :-). My current thinking is that a smaller set of projects will work on the higher levels. Over time, some of the higher-level criteria will start to trickle down - especially as technology makes them easier to do.

I'm fine with leaving the higher level badges for projects with more paid staff
or perhaps just more (or more efficient) contributors that can manage to keep
those practices up. Ideally this leads to volunteers showing up wanting to
help us reach passing+N compliance.
Agreed. In particular, I expect "passing+2" will be difficult and only a relatively small set of projects will meet it. But if everyone agrees that those criteria *should* be done, it could lead to volunteers helping. It could also be the basis for a funding request.

This is *not* a complaint on the new suggested levels even if it may sound
like that. I think it'll bring light on areas that can and should be improved.
Thanks.

I think it's important to winnow out criteria that aren't really a good idea. In particular, I want to avoid "hype driven development" <http://thenewstack.io/programmers-react-warning-hype-driven-development/> - we should specify criteria that are *proven* and technology-independent. Software development is remarkably subject to fads; when bell bottoms appear, no one seems to ask if something is *actually* a good idea.

--- David A. Wheeler


David A. Wheeler
 

-----Original Message-----
From: Kevin W. Wall [mailto:kevin.w.wall@gmail.com]
My $.02 on the early draft....

Thanks so much!! We've tried to incorporate your thoughts.


***** Potential passing+1 criteria *****
Under "Regression tests", it states
When a bug is fixed, a regression test MUST normally be added to the
automated test suite to prevent its reoccurrence.

"MUST normally"? That sounds like SHOULD. If it isn't SHOULD, then why the
"normally"? If this is mandatory, but there are exceptions, then those
exceptions will have to be explicitly spelled out.
You're right, that's a problem. Hmm. We really need for these criteria to be cut-and-dried. How about this:

- The project MUST add regression tests to an automated test suite
for at least 50% of the bugs fixed within the last six months.
<sup>[<a href="#regression_tests_added50">regression_tests_added50</a>]</sup>

*Rationale*: Regression tests prevent undetected resurfacing of
defects. If a defect has happened before, there is an increased
likelihood that it will happen again. We only require 50% of bugs to
have regression tests; not all bugs are equally likely to recur,
and in some cases it is extremely difficult to build robust tests for
them. Thus, there is a point of diminishing returns in adding
regression tests. The 50% value could be argued to be arbitrary;
however, requiring less than 50% would mean that projects could
get the badge even if a majority of their bugs in the time frame
had no regression tests. Projects may,
of course, choose to have much larger percentages.
We choose six months, as with other requirements, so that projects
that have done nothing in the past (or recorded nothing in the past)
can catch up in a reasonable period of time.
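The proposed regression_tests_added50 check is mechanical enough to sketch. This is purely illustrative; the FixedBug record shape and the 183-day approximation of "six months" are my assumptions, not part of the criterion text:

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class FixedBug:
    closed_on: date             # when the bug was fixed
    has_regression_test: bool   # was a regression test added for it?

def meets_regression_tests_added50(bugs, today=None):
    """True if at least 50% of bugs fixed within the last six months
    (approximated here as 183 days) have a regression test."""
    today = today or date.today()
    cutoff = today - timedelta(days=183)
    recent = [b for b in bugs if b.closed_on >= cutoff]
    if not recent:
        return True  # no recent fixes, so the criterion is vacuously met
    covered = sum(b.has_regression_test for b in recent)
    return covered / len(recent) >= 0.5
```

In practice a badge application would derive this data from the project's issue tracker rather than self-reported records, but the arithmetic is the same.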


Under "Reproduceable [sic] build":

This has proven difficult because of upgrades to various build tools
(compilers, assemblers, linkers, etc.) and possible OS dependencies (e.g., on
system headers and libraries). It would be one thing to say that product
XYZ was built on a default install of version m.n.p of ABC distro of Linux (but
of course, even there, how many developers do you know who have default
installs?), and another thing to have to figure out everything that needs to be
specified in the build tool chain and gather all those dependencies as well as
the exact architecture it was built under. I look at it this way... if someone
doesn't trust the binaries, then let them build it themselves. This is, after all,
supposed to be open source, so that should always be possible. As for the
source, what you see is what you git. (Sorry, I couldn't
resist.) No, seriously, I'm okay with requiring that binaries be digitally signed
or, at a minimum, requiring a hash, but I think if we start requiring reproducible
builds, it will result in those who wish to comply including the whole bloody
tool chain in GitHub or whatever repo they are using.
It's not as hard as it used to be, because of the widespread availability of containers, and Debian's efforts have improved many tools.

That said, I get your point. How about this: let's try to separate *repeatable* builds (where a project can get the same answers) and *reproducible* builds (where external parties can reproduce the results). Then we can put *repeatable* builds in passing+1, and full reproducible builds as a MUST in passing+2 (we might make them SUGGESTED at a lower level).

For the *repeatable* builds, here's a cut:

- <a name="build_repeatable"></a>
The project MUST be able to repeat the process of
generating information from source files and get exactly
the same bit-for-bit result.
If no building occurs
(e.g., scripting languages where the source code
is used directly instead of being compiled), select "N/A".
GCC and clang users may find the -frandom-seed option useful;
in some cases, this can be resolved by forcing a sort order.
More suggestions can be found at the
[reproducible build](https://reproducible-builds.org/) site.
<sup>[<a href="#build_repeatable">build_repeatable</a>]</sup>

*Rationale*: This is a step towards having a
[reproducible build](https://reproducible-builds.org/).
This criterion is much easier to meet, because it does not require
that external parties be able to reproduce the results - merely
that the project can.
Supporting full reproducible builds requires that projects provide
external parties enough information about their build environment(s),
which can be harder to do - so we have split this requirement up.
See the [reproducible build criterion](#reproducible_build).
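The repeatable-build criterion above can be checked mechanically: run the build twice in fresh copies of the source tree and compare every output file bit for bit. A minimal, illustrative Python sketch (the build command and directory layout are assumptions, not part of the criterion):

```python
import hashlib
import shutil
import subprocess
import tempfile
from pathlib import Path

def digest_tree(root: Path) -> dict:
    """Map each file's relative path to its SHA-256 digest."""
    return {
        p.relative_to(root).as_posix(): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*")) if p.is_file()
    }

def build_is_repeatable(build_cmd, source_dir) -> bool:
    """Run the build twice, each time in a fresh copy of the source
    tree, and compare all resulting files bit for bit (via SHA-256)."""
    digests = []
    for _ in range(2):
        with tempfile.TemporaryDirectory() as tmp:
            work = Path(tmp) / "src"
            shutil.copytree(source_dir, work)
            subprocess.run(build_cmd, cwd=work, check=True)
            digests.append(digest_tree(work))
    return digests[0] == digests[1]
```

A check like this catches the common sources of non-repeatability (embedded timestamps, randomized hash seeds, nondeterministic file ordering) without requiring any external party to reconstruct the build environment.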


Under "Documentation", it states:
The project MUST include reference documentation that describes its data
flow.

Question: how does one do that for a general API project and is it even meaningful?
Fair enough. Dropped.

A big +1 on "Security analysis / dependencies".
:-). Thanks!

I'll separately post about the others.

--- David A. Wheeler


Mark Rader
 

David

I'm reading through the level criteria right now.  One thought that comes to mind by way of design: if you are going to have multiple levels, I would suggest changes to the web site whereby the criteria for each level are checked simultaneously, and each criterion is marked with its level.

So a person would see 75% for level 1, 65% for level 2, etc. That way they could gauge overall progress. Also, have the fill-in boxes color-coded to each level so that, as the questionnaire is filled out, you can identify which blocks are needed to attain which level.

A lot of getting people to do this will be inspiring them by showing how close they really are to the next level, making it a kind of quest.
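The per-level progress display suggested above boils down to a small computation over the questionnaire state. An illustrative sketch; the level names and criterion ids are hypothetical:

```python
def level_progress(criteria, met):
    """criteria: mapping of level name -> list of criterion ids at that level;
    met: set of criterion ids the project has already satisfied.
    Returns a mapping of level name -> percent complete (0-100)."""
    return {
        level: round(100 * sum(c in met for c in ids) / len(ids))
        for level, ids in criteria.items()
    }
```

The badge site could render these percentages next to each level, with the color coding applied per criterion based on which level's list it belongs to.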

Also, one thing that may be helpful is showing that this is not only a way to improve security practices: it also becomes a "management" tool for guiding the development of the project as a whole, and with the higher-level badges (and even the basic badge) it provides a framework to help ensure the project continues after the departure of the originators, so that the baby, so to speak, will continue to grow. It's nice to emphasize that it is a security best practice, but some of the real value-add is also in the management and the succession/transition of projects once they outgrow the originator.

Mark

On Thu, Dec 15, 2016 at 6:42 PM, Wheeler, David A <dwheeler@...> wrote:

All: I'd like to start creating the criteria for "higher level" badges. Please reply, or add issues, for things you think should be included.

 

A very early draft is here:

https://github.com/linuxfoundation/cii-best-practices-badge/blob/master/doc/other.md

Stuff we could add at a higher level includes test coverage criteria, bus factors, etc.

 

For the moment we should probably call these "passing+1" and "passing+2"... we can name them silver/gold/platinum/whatever later.

 

--- David A. Wheeler

 


_______________________________________________
CII-badges mailing list
CII-badges@lists.coreinfrastructure.org
https://lists.coreinfrastructure.org/mailman/listinfo/cii-badges