
Re: New Badges! Congrats!

Daniel Stenberg
 

On Mon, 22 Aug 2016, Wheeler, David A wrote:

* c-ares: https://bestpractices.coreinfrastructure.org/projects/291 - a dependency of curl's
I mentioned this to David in private already, but I have this vision that I would like curl (which reached 100% back in March) to also have all, or at least a significant portion, of its dependencies as "100% projects".

curl as a tool and library can be built to use an insane number of different dependencies (it might be 21 different ones, many of them mutually exclusive).

Because even if you can see one project being "sensible" and showing off a 100% best practices badge, what is it actually worth to the end user if it uses N dependencies that are not?

--

/ daniel.haxx.se


Re: Finally completed badge, feedback on process

 

Apologies on the lack of brevity in advance.

Getting to 100% passing was relatively easy for BRL-CAD with
only one MUST item arguably being unmet beforehand (our website
certificate didn’t match our domain, fixed). The rest was mostly a matter of
documentation and elaboration.
Good to hear. I would expect most well-run projects to "mostly" do well. Fixing domain certificates, and doing a little documentation & elaboration, is (to me) a *good* thing.
The "documentation and elaboration” I mentioned was actually on the badging form itself (i.e., the various explain/justify prompts).

In BRL-CAD's case, we fortunately already had everything publicly available and documented in expected locations. Of course, always room for improving, consolidating, and simplifying documentation so information is easier to find.

I also note there isn't a requirement against the certificate being self-signed, having a name mismatch, or having no chain. Might be worthy of a SHOULD clause.

Another feedback point that I’m not sure how / if the badging process could capture: Accessibility of information. In filling out the information for BRL-CAD, I was able to find and cite everything. However, it was spread across a half dozen different resources, some on the web, some in the repo, etc.

For criteria like the license, it spells out a standard location (e.g., COPYING in the repo). What about doing something similar for some of the other documentation items?

For example, under the first Documentation section, it says that docs must exist. Is there a best practice on where those docs must be? I’d argue that a source code repository MUST 1) have some sort of obvious introductory documentation (e.g., a README in the top-level directory for most projects) and 2) have contributor documentation provided in there or in another obvious location (e.g., in a similarly obvious top-level document like CONTRIBUTING.md, HACKING, DEVINFO, etc.).

We certainly *could* add 3rd-party review gating. For example, we don’t advertise this much, but I *do* review each passing badge to look for nonsense. But it's merely a brief look for nonsense, not a rigorous re-check.
Unfortunately, you don’t scale. ;)

So the question is... how could we scale that 3rd-party review? My current position is to focus on improving automation... which would make everyone happy (faster to get a badge, more rigorous checking). Are there good scalable alternatives?
I could see two ways working, both requiring implementation effort.

The first would be to simply add a MUST peer-review sign-off criterion that N others must mark as Met/Unmet. The system would randomly contact N (e.g., N == 3) of the 100% completed projects and ask them to review their peer. If more than half concur, the project joins the 100% club. Basically, treat it like academic paper peer review.

An even better approach, though more work to implement, would be to take a page from Stack Overflow. Let registered readers mark individual criteria responses as satisfactory or unsatisfactory, and leave comments. When a statistically significant % (e.g., a majority after >3 votes) disagree, the item gets set to Unmet and the project must do some work and try again, or appeal to a higher authority for arbitration. With the social approach, the system easily expands to review queues where the workload is shared.
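As a minimal sketch of the voting rule described above (the function name is hypothetical and the >3-vote cutoff is taken from the example in the text, not from any actual BadgeApp behavior):

```python
# Illustrative sketch, not BadgeApp code: a criterion response flips to
# Unmet once more than 3 votes are in and a majority are "unsatisfactory".
def should_flip_to_unmet(satisfactory_votes: int, unsatisfactory_votes: int) -> bool:
    total = satisfactory_votes + unsatisfactory_votes
    return total > 3 and unsatisfactory_votes > total / 2

print(should_flip_to_unmet(1, 3))  # 4 votes, 3 unsatisfactory: flips
print(should_flip_to_unmet(2, 2))  # a tie is not a majority: stays
```

The ">3 votes" threshold keeps one or two drive-by downvotes from flipping an item; any real implementation would need to tune that cutoff.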

I certainly agree that centralized version control is viable (having used sccs, rcs, cvs, and subversion at one time or other). The argument for this criterion is that distributed systems tend to make it easier to collaborate (because you can easily re-heal changes initiated at different times)... and since it's only SUGGESTED, it does not *mandate* decentralized version control.
It certainly makes it easier for unaffiliated contributors to collaborate, which can be good but is not automatically a good thing, strictly speaking. For core devs, it’s not true: I can take a team of devs, put them on a centralized repo (regardless of SCM), and establish commit norms that will be more efficient for that team’s collaboration (unquestionably at restricted capability with others, but therein lies the efficiency savings).

There are also other merits of a centralized repository in terms of positively controlling source code quality, conducting verification and validation, QA, etc. If a group is going for ISO 9000 or CMMI certification, this is arguably far easier to ensure, track, and reasonably prove with a centralized repository. You can have failures and successes in communication with both central and distributed.

That said, we could certainly drop this criterion. It's only SUGGESTED (which hints at your next point), so it doesn't have much "oomph" - and shortening the criteria a little bit is a good thing. I think we should *expect* that as we get more projects & experience there will be improvements and tightening of the criteria. The *much* more important issue is to *have* version control, and I think it's generally agreed that version control should stay as a MUST.
Absolutely a MUST. I suggest half-merging that last one with the first criterion: “The project MUST have a version-controlled source repository using common version control software (e.g., git, svn), accessible via public URL.”

Reasonable enough. I think the SUGGESTED items have some value, because psychologically people don't like to admit they don't do something (if they think they should be doing it). But I could be mistaken.
I might agree for most of the SHOULD items, but adding another, even weaker level made me (as a reader) dismiss their value even when fully met: you’re not willing to commit to them, thus I have no reason to care. If I don’t meet one of them (e.g., use common build tools), I’m probably going to feel completely justified (my tools are superior). There’s nothing to “admit”.

That's certainly true in a sense. The point, though, is that the project has to *tell* reporters how to do the private reporting. A lot of projects just don't tell people how to report vulnerabilities (they've never considered the possibility), so the real goal is to get them to write it down ahead-of-time.
I gathered as much, so perhaps a change in wording to clarify: “If private vulnerability reports are supported, the project MUST publicly document the process for sending information privately.”

Well, I've seen some non-working build systems, but you're right, people don't want to *admit* they're not working.
We’ve all seen them, and most of the time the upstream dev will simply (often with justification) claim the failure is a problem with the environment, an unsupported configuration, etc.

Would changing "Working" to "Automated" be an improvement?
I think what’s missing is to define either term. Maybe something like this:

"If the software requires building, the project MUST document supported environments, dependencies (libraries and tools) required for installation, and default build instructions that work automatically for those environments.

The problem is that it depends on the overall platform (including language and underlying OS) & the tools available. Making it a MUST would be too harsh for many cases. For example, you can find more problems by enabling more warnings, but those higher levels tend to be much noisier. We don't want to *discourage* people from using more sensitive (though noisier) tools.
Commented on this in the other e-mail reply to Daniel.

Cheers!
Sean


Re: Finally completed badge, feedback on process

 

Which criterion are you saying is taking a position on this?
David's reply covered this; it’s the last SUGGESTED item under public version-controlled source repository.

6) Treating warnings as errors shouldn’t be a suggestion. Projects SHOULD be maximally strict, treating warnings as errors, with minimal exceptions (e.g., less than 1 exemption per 100 files). Frankly, I think it should be a MUST.
"maximally strict" is not a binary option, though. In many languages you can select levels of compliance or which standard to use for warnings. Like in C, lots of things in the older standard cause warnings in the newer etc. Compilers these days also allow warnings as help, they don't necessarily identify errors but *possible* errors. Would a single warning about an suspected source code indentation problem be a reason for a project to not follow best practices?
Completely agree, and absolutely not a reason. The implication is that each project would be setting a bar wherever it makes sense for them. The SHOULD/MUST requirement would merely be that the project HAS a bar defined somewhere that they strictly maintain. If they want to go hog wild with -Wextra -Weverything-else-under-the-sun and address them, more power to them. There's reasonable argument that even addressing false-positive code smells will improve code quality.

Also, warnings on which platforms? I work on projects that build on virtually every architecture and operating system that runs on 32 bits. Making them all build warning-free everywhere is, if not impossible (in some cases the warnings are literally impossible to fix since they're mutually exclusive on different platforms), at least not something that would benefit the project or improve the source code.
Building strict on MSVC is notoriously problematic, where some of their warnings actually amount to “WARNING: we’re now standards-compliant, and you might have previously relied on undefined behavior.” Their documented solution is to pragma-disable! Other warnings actively harm the code too by reducing maintainability, increasing error-proneness, etc.

I think this is again a matter of the project having set their bar somewhere, anywhere. Even on Windows, there are a plethora of really good warnings that it will uniquely issue and once a project is clean, they can add it to their strict-level check to maintain that level of code quality going forward.

My (so far) two listed 100% projects would not be 100% if this were a MUST saying "on all platforms". And in fact, that would make it really hard for a very large number of portable C and C++ projects.
“On all platforms” is impossible, if only for the mutual exclusion issue you mention. I would suggest replacing the item with something like this:

The project MUST have a build configuration option that is maximally strict, in which some defined set of warnings is treated as errors.
[Show details]
It’s acknowledged that some platforms and compiler warnings are notoriously problematic and can be mutually exclusive or outright counterproductive. Other warning levels often contain high levels of false positives, but even addressing them can lead to improved code quality (see https://en.wikipedia.org/wiki/Code_smell). The recommended best practice is that a project defines a minimum warning compliance level that they actively maintain for one or more platform configurations. A project SHOULD, by default, treat warnings as errors for some set of warnings, however minimal, and strive to add additional warnings to their strictness repertoire over time.
[/Show details]
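The "define a bar and strictly maintain it" idea above can be sketched minimally. The thread is about C/C++ compiler flags (e.g., -Werror), but as a hedged illustration in Python, a project can promote a chosen class of warnings to hard errors while leaving everything else as ordinary warnings, then widen that set over time:

```python
import warnings

# Promote only a chosen, minimal category of warnings to errors; other
# warning categories keep their default behavior, so the bar can be
# raised incrementally rather than all at once.
warnings.simplefilter("error", DeprecationWarning)

try:
    warnings.warn("old_api() is deprecated", DeprecationWarning)
    outcome = "passed"
except DeprecationWarning:
    outcome = "failed: deprecation treated as error"

print(outcome)
```

This mirrors the proposed criterion: the project picks *some* set of warnings, however minimal, that fails the build, and grows it as the code is cleaned up.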

Agreed! It is also somewhat related to: https://github.com/linuxfoundation/cii-best-practices-badge/issues/463 "Add support for large badge"
Note there is a quick CSS workaround by modifying the tag: <img style="height: 64px; width: auto;" …>. Noted on the issue.

Cheers!
Sean


Re: Finally completed badge, feedback on process

David A. Wheeler
 

I presume you mean the <title>...</title> value in the HTML page, which shows up on browser tabs. You're absolutely right, and we can quickly fix this too.
Dan Kohn [mailto:dan@linuxfoundation.org]:
This is an obvious bug and easy fix and I'm happy to submit a PR.
Sure! I like pull requests! :-)

--- David A. Wheeler


Re: Finally completed badge, feedback on process

Dan Kohn
 

> 7) The “BadgeApp” title on individual badging pages makes for a terrible title
> when sharing with others (e.g., via Facebook). Suggest something like “CII’s
> Best Practices Badge for $project_name"


I presume you mean the <title>...</title> value in the HTML page, which shows up on browser tabs.

You're absolutely right, and we can quickly fix this too.


This is an obvious bug and easy fix and I'm happy to submit a PR.
--
Dan Kohn <mailto:dan@...>
Executive Director, Cloud Native Computing Foundation <https://cncf.io/>
tel:+1-415-233-1000


Re: Finally completed badge, feedback on process

David A. Wheeler
 

Christopher Sean Morrison:
I finally pushed enough off my plate and found time to finish filling out BRL-CAD's badging, which I’d started 6 months ago. Happy to be #8 in the list and 28th to get to 100%.
Congrats!! That's excellent.

Here’s a retrospective with feedback.

In all, it took about 3 interrupted hours total to gather, fact check, and write
up responses for all fields. Probably would have taken an hour
uninterrupted.
I really appreciate the write-ups. The 1 hour is consistent with what we've been seeing & estimating.

Getting to 100% passing was relatively easy for BRL-CAD with
only one MUST item arguably being unmet beforehand (our website
certificate didn’t match our domain, fixed). The rest was mostly a matter of
documentation and elaboration.
Good to hear. I would expect most well-run projects to "mostly" do well. Fixing domain certificates, and doing a little documentation & elaboration, is (to me) a *good* thing.

Here’s my top-7 critical feedback:

1) Despite so many fields, it’s too easy to (falsely) pass. Looking at others with
100%, I would challenge some of the subjective MUST responses, and expect
to be challenged in kind. Incorporating 3rd-party review gating before
achieving 100% passing would increase overall value.
We certainly *could* add 3rd-party review gating. For example, we don’t advertise this much, but I *do* review each passing badge to look for nonsense. But it's merely a brief look for nonsense, not a rigorous re-check.

This is a fair concern, and one that's been discussed from the beginning. The reason we didn't *require* 3rd-party review gating is because we're concerned that we would become the bottleneck as the number of projects increases. There are a lot of OSS projects out there.


So the question is... how could we scale that 3rd-party review? My current position is to focus on improving automation... which would make everyone happy (faster to get a badge, more rigorous checking). Are there good scalable alternatives?


2) Taking a position on distributed vs centralized version control is
contemporary flamebait, both with merit and downsides. There are robust
examples of both being perfectly viable, secure, and best practice. Popularity
should have no bearing on recommendations.
(This is in reference to: "It is SUGGESTED that common distributed version control software be used (e.g., git). [repo_distributed]").

I certainly agree that centralized version control is viable (having used sccs, rcs, cvs, and subversion at one time or other). The argument for this criterion is that distributed systems tend to make it easier to collaborate (because you can easily re-heal changes initiated at different times)... and since it's only SUGGESTED, it does not *mandate* decentralized version control.

That said, we could certainly drop this criterion. It's only SUGGESTED (which hints at your next point), so it doesn't have much "oomph" - and shortening the criteria a little bit is a good thing. I think we should *expect* that as we get more projects & experience there will be improvements and tightening of the criteria. The *much* more important issue is to *have* version control, and I think it's generally agreed that version control should stay as a MUST.

3) Most of the SUGGESTED items devalue the badge through dilution. Some
could graduate to SHOULD (e.g., those under Quality) while the remainder
offer little to no value (as they have no bearing on the badge and only
increase burden). I would recommend removing the non-Quality ones.
Reasonable enough. I think the SUGGESTED items have some value, because psychologically people don't like to admit they don't do something (if they think they should be doing it). But I could be mistaken.

I'd like to hear others' opinions on this. I note that Daniel Stenberg agrees with this comment.

4) Private reports MUST … be privately reportable. N/A notwithstanding, I
don’t see how this could ever be Unmet. If there’s no private reporting
mechanism, private reports are de-facto not supported.
That's certainly true in a sense. The point, though, is that the project has to *tell* reporters how to do the private reporting. A lot of projects just don't tell people how to report vulnerabilities (they've never considered the possibility), so the real goal is to get them to write it down ahead-of-time.

5) "Working build system” is not strictly defined (perhaps intentionally) but
“working” is the more questionable part. Flaky open source compilation is
the epitome of “works for me” ignorance. Nobody with a build system will
say it’s not working.
:-).

Well, I've seen some non-working build systems, but you're right, people don't want to *admit* they're not working.

Would changing "Working" to "Automated" be an improvement?


6) Treating warnings as errors shouldn’t be a suggestion. Projects SHOULD be
maximally strict, treating warnings as errors, with minimal exceptions (e.g.,
less than 1 exemption per 100 files). Frankly, I think it should be a MUST.
The problem is that it depends on the overall platform (including language and underlying OS) & the tools available. Making it a MUST would be too harsh for many cases. For example, you can find more problems by enabling more warnings, but those higher levels tend to be much noisier. We don't want to *discourage* people from using more sensitive (though noisier) tools.


7) The “BadgeApp” title on individual badging pages makes for a terrible title
when sharing with others (e.g., via Facebook). Suggest something like “CII’s
Best Practices Badge for $project_name"
I presume you mean the <title>...</title> value in the HTML page, which shows up on browser tabs.

You're absolutely right, and we can quickly fix this too.

--- David A. Wheeler


New Badges! Congrats!

David A. Wheeler
 

As you can see, we have great news - more projects have badges:

* c-ares: https://bestpractices.coreinfrastructure.org/projects/291 - a dependency of curl’s

* BRL-CAD: https://bestpractices.coreinfrastructure.org/projects/66

 

Christopher Sean Morrison (BRL-CAD) has posted some feedback, which I appreciate & will respond to separately.

 

But I don't want to lose sight of the main objective - we're getting more projects in, and more projects are getting badges.

 

--- David A. Wheeler

 


Re: Finally completed badge, feedback on process

Daniel Stenberg
 

On Mon, 22 Aug 2016, Christopher Sean Morrison wrote:

2) Taking a position on distributed vs centralized version control is contemporary flamebait, both with merit and downsides.
Which criterion are you saying is taking a position on this?

3) Most of the SUGGESTED items devalue the badge through dilution. Some could graduate to SHOULD (e.g., those under Quality) while the remainder offer little to no value (as they have no bearing on the badge and only increase burden). I would recommend removing the non-Quality ones.
Having filled in details on three projects so far, I agree with this.

6) Treating warnings as errors shouldn’t be a suggestion. Projects SHOULD be maximally strict, treating warnings as errors, with minimal exceptions (e.g., less than 1 exemption per 100 files). Frankly, I think it should be a MUST.
"maximally strict" is not a binary option, though. In many languages you can select levels of compliance or which standard to use for warnings. Like in C, lots of things in the older standard cause warnings in the newer etc. Compilers these days also allow warnings as help, they don't necessarily identify errors but *possible* errors. Would a single warning about an suspected source code indentation problem be a reason for a project to not follow best practices?

Also, warnings on which platforms? I work on projects that build on virtually every architecture and operating system that runs on 32 bits. Making them all build warning-free everywhere is, if not impossible (in some cases the warnings are literally impossible to fix since they're mutually exclusive on different platforms), at least not something that would benefit the project or improve the source code.

My (so far) two listed 100% projects would not be 100% if this were a MUST saying "on all platforms". And in fact, that would make it really hard for a very large number of portable C and C++ projects.

7) The “BadgeApp” title on individual badging pages makes for a terrible title when sharing with others (e.g., via Facebook). Suggest something like “CII’s Best Practices Badge for $project_name"
Agreed! It is also somewhat related to: https://github.com/linuxfoundation/cii-best-practices-badge/issues/463 "Add support for large badge"

I would like a badge that is more self-explanatory. I think the current incarnation isn't very suitable stand-alone in a non-technical environment, like for example a project's front web site page.

--

/ daniel.haxx.se


Finally completed badge, feedback on process

 

I finally pushed enough off my plate and found time to finish filling out BRL-CAD's badging, which I’d started 6 months ago. Happy to be #8 in the list and 28th to get to 100%. Here’s a retrospective with feedback.

In all, it took about 3 interrupted hours total to gather, fact check, and write up responses for all fields. Probably would have taken an hour uninterrupted. Getting to 100% passing was relatively easy for BRL-CAD with only one MUST item arguably being unmet beforehand (our website certificate didn’t match our domain, fixed). The rest was mostly a matter of documentation and elaboration.

Here’s my top-7 critical feedback:

1) Despite so many fields, it’s too easy to (falsely) pass. Looking at others with 100%, I would challenge some of the subjective MUST responses, and expect to be challenged in kind. Incorporating 3rd-party review gating before achieving 100% passing would increase overall value.

2) Taking a position on distributed vs centralized version control is contemporary flamebait, both with merit and downsides. There are robust examples of both being perfectly viable, secure, and best practice. Popularity should have no bearing on recommendations.

3) Most of the SUGGESTED items devalue the badge through dilution. Some could graduate to SHOULD (e.g., those under Quality) while the remainder offer little to no value (as they have no bearing on the badge and only increase burden). I would recommend removing the non-Quality ones.

4) Private reports MUST … be privately reportable. N/A notwithstanding, I don’t see how this could ever be Unmet. If there’s no private reporting mechanism, private reports are de-facto not supported.

5) "Working build system” is not strictly defined (perhaps intentionally) but “working” is the more questionable part. Flaky open source compilation is the epitome of “works for me” ignorance. Nobody with a build system will say it’s not working.

6) Treating warnings as errors shouldn’t be a suggestion. Projects SHOULD be maximally strict, treating warnings as errors, with minimal exceptions (e.g., less than 1 exemption per 100 files). Frankly, I think it should be a MUST.

7) The “BadgeApp” title on individual badging pages makes for a terrible title when sharing with others (e.g., via Facebook). Suggest something like “CII’s Best Practices Badge for $project_name"

Cheers!
Sean


Re: HTTPSWatch information

Mark Rader
 

David

The bad thing is UBUNTU. 

Mark

On Thu, Aug 18, 2016 at 5:44 PM, Wheeler, David A <dwheeler@...> wrote:

FYI, the site “HTTPSWatch” reports on sites’ HTTPS support by type of site.

 

This one is especially relevant:

  https://httpswatch.com/programming

 

The bottom line is that *some* sites are doing okay (GitHub, Fedora) by this measure, but others have a ways to go.

 

This matters because one of our criteria is HTTPS.

 

--- David A. Wheeler

 


_______________________________________________
CII-badges mailing list
CII-badges@lists.coreinfrastructure.org
https://lists.coreinfrastructure.org/mailman/listinfo/cii-badges



HTTPSWatch information

David A. Wheeler
 

FYI, the site “HTTPSWatch” reports on sites’ HTTPS support by type of site.

 

This one is especially relevant:

  https://httpswatch.com/programming

 

The bottom line is that *some* sites are doing okay (GitHub, Fedora) by this measure, but others have a ways to go.

 

This matters because one of our criteria is HTTPS.

 

--- David A. Wheeler

 


FYI: Vulnerabilities in BadgeApp dependencies were automatically detected & quickly fixed last week

David A. Wheeler
 

I don't post as much here about the "plumbing" of the BadgeApp web app, but some of you might be interested in the following.

--- David A. Wheeler

==============================

Last week (on August 11) two vulnerabilities were publicly announced in Rails. I was quickly notified about this, because we have two different processes that look for publicly-reported vulnerabilities in our dependencies ("bundle-audit" as embedded in our "rake" task, and the Gemnasium service). In this case, the "rake" task told me first.

I then quickly updated to a fixed version of rails, tested it using our test suite (which covers 98% of the code), pushed out for some additional brief testing on a mock "real" site, and soon afterwards pushed the fixed version out to production. All without anyone else noticing. Because we are ready for public reports of vulnerabilities, we don't need days to respond.

This is a good demo (I think) of why it's important to have good test suites (with reasonable coverage) & tools that report when a vulnerability is found in a dependency. The current criteria already require *some* automated testing. It's my expectation that a future higher level would add (1) a coverage requirement (for the automated testing) and (2) a requirement that there be some way to monitor dependencies so you know when a vulnerability is publicly announced in them. I'm sure it'll be challenging to get those worded well. However, I think this is a good example of why that's important.

The details: I just needed to update rails from 4.2.6 to 4.2.7.1 to fix two vulnerabilities:
* CVE-2016-6316: actionview
Possible XSS Vulnerability in Action View
https://groups.google.com/forum/#!topic/rubyonrails-security/I-VWr034ouk
* CVE-2016-6317: activerecord
Unsafe Query Generation Risk in Active Record
https://groups.google.com/forum/#!topic/rubyonrails-security/rgO20zYW33s
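The detection workflow described above can be sketched as a toy version-comparison check. This is only an illustration of the idea behind tools like bundle-audit, not their actual code; the advisory entries simply restate the two Rails CVEs listed above, and the function names are made up for this sketch:

```python
# Toy dependency-audit sketch: flag pinned versions that predate the
# release fixing a known advisory. Advisory data restates the two
# Rails CVEs from the message above (fixed in 4.2.7.1).
ADVISORIES = {
    "rails": [("CVE-2016-6316", "4.2.7.1"),
              ("CVE-2016-6317", "4.2.7.1")],
}

def parse_version(v: str) -> tuple:
    # "4.2.6" -> (4, 2, 6); tuples compare component-wise.
    return tuple(int(part) for part in v.split("."))

def audit(pinned: dict) -> list:
    findings = []
    for name, version in pinned.items():
        for cve, fixed_in in ADVISORIES.get(name, []):
            if parse_version(version) < parse_version(fixed_in):
                findings.append((name, cve, fixed_in))
    return findings

# rails 4.2.6 predates the fixed 4.2.7.1 release, so both CVEs flag:
print(audit({"rails": "4.2.6"}))
```

Real tools, of course, pull advisory databases automatically and handle pre-release and non-numeric version schemes; the point is just how cheap continuous dependency monitoring is once it's wired into a routine task.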

Hopefully this will convince you that we *do* care about the security of the BadgeApp itself, and take steps to keep it secure. More information on how we work to try to make BadgeApp secure is here:
https://github.com/linuxfoundation/cii-best-practices-badge/blob/master/doc/security.md

Anyway, I thought some people might be interested.


Re: New badge-holder: JSON for Modern C++

David A. Wheeler
 

Dan Kohn [mailto:dan@linuxfoundation.org]:
It's particularly nice that on their readme, they call out following CII best practices as one of the reasons to choose their library. Hopefully, we can be a competitive feature going forward that will cause a race toward better practices.
I *completely* agree. I would love to see a widespread race to implement better practices. There's reason to hope that we're seeing the start of it.

-- David A. Wheeler


Re: New badge-holder: JSON for Modern C++

Dan Kohn
 

It's particularly nice that on their readme, they call out following CII best practices as one of the reasons to choose their library. Hopefully, we can be a competitive feature going forward that will cause a race toward better practices.



On Thu, Aug 18, 2016 3:26 PM, Wheeler, David A dwheeler@... wrote:

Great news!  The project “JSON for Modern C++” just got a badge:

  https://bestpractices.coreinfrastructure.org/projects/289

 

They started a badge entry 4 days ago (2016-08-15 07:12:13 UTC) and got the badge just a few hours ago (2016-08-18 16:38:07 UTC).  It looks good; they have lots of clear justifications, and the project is already proudly displaying a badge (great!  They’ve earned it!).

 

I took a look at the project repo page:

  https://github.com/nlohmann/json

It looks like they’re doing *lots* of good things, including CI builds with Travis, 100% test coverage, a passing Coverity scan, and using cppcheck (a common tool for analyzing C++ code).  Very impressive.

 

My congrats to the project, nice job!

 

--- David A. Wheeler

 



--
Dan Kohn <mailto:dan@...>
Executive Director, Cloud Native Computing Foundation <https://cncf.io/>
tel:+1-415-233-1000


New badge-holder: JSON for Modern C++

David A. Wheeler
 

Great news!  The project “JSON for Modern C++” just got a badge:

  https://bestpractices.coreinfrastructure.org/projects/289

 

They started a badge entry 4 days ago (2016-08-15 07:12:13 UTC) and got the badge just a few hours ago (2016-08-18 16:38:07 UTC).  It looks good; they have lots of clear justifications, and the project is already proudly displaying a badge (great!  They’ve earned it!).

 

I took a look at the project repo page:

  https://github.com/nlohmann/json

It looks like they’re doing *lots* of good things, including CI builds with Travis, 100% test coverage, a passing Coverity scan, and using cppcheck (a common tool for analyzing C++ code).  Very impressive.

 

My congrats to the project, nice job!

 

--- David A. Wheeler

 


Re: Most-missed criteria for projects

Daniel Stenberg
 

On Sun, 24 Jul 2016, Wheeler, David A wrote:

I'd like feedback. My hope is that by doing this analysis we'll get a better idea of what the issues are.
This is awesome data and I'd *love* it if we could have this information generated on a regular basis and shown on the site. As we're slowly increasing the number of projects, I figure it'll also get clearer exactly which criteria many of us struggle to meet.

--

/ daniel.haxx.se


Re: Lots of badging activity, OPNFV just got a badge

Dale Visser
 

I’ve been (gently) pushing the Syncthing project to put their badge on their project README. See the conversation here:

 

https://github.com/syncthing/syncthing/pull/3515

 

From: cii-badges-bounces@... [mailto:cii-badges-bounces@...] On Behalf Of Dan Kohn
Sent: Wednesday, August 17, 2016 10:09 AM
To: Wheeler, David A <dwheeler@...>
Cc: cii-badges@...
Subject: Re: [CII-badges] Lots of badging activity, OPNFV just got a badge

 

It definitely seems like we hit some sort of inflection point this month, perhaps triggered by the CII summit. Hopefully, the momentum will continue to build, especially as the badges are designed to grow virally as other developers see them.

 

 

On Wed, Aug 17, 2016 9:58 AM, Wheeler, David A dwheeler@... wrote:

Good news, we are seeing a significant influx in the number of projects pursuing a badge:

https://bestpractices.coreinfrastructure.org/project_stats

As of late yesterday we have 215 projects pursuing a badge; 50 were at 90% or greater, and 24 were at 100% (passing).

 

It won't show on the daily stats yet, but OPNFV just got a badge:

https://bestpractices.coreinfrastructure.org/projects/164

Which means we actually have 25 passing right now (with many more that are *close*).

 

--- David A. Wheeler

 


 

 

--

Dan Kohn <mailto:dan@...>

Executive Director, Cloud Native Computing Foundation <https://cncf.io/>
tel:+1-415-233-1000


Re: Lots of badging activity, OPNFV just got a badge

Dan Kohn
 

It definitely seems like we hit some sort of inflection point this month, perhaps triggered by the CII summit. Hopefully, the momentum will continue to build, especially as the badges are designed to grow virally as other developers see them.



On Wed, Aug 17, 2016 9:58 AM, Wheeler, David A dwheeler@... wrote:

Good news, we are seeing a significant influx in the number of projects pursuing a badge:

https://bestpractices.coreinfrastructure.org/project_stats

As of late yesterday we have 215 projects pursuing a badge; 50 were at 90% or greater, and 24 were at 100% (passing).


It won't show on the daily stats yet, but OPNFV just got a badge:

https://bestpractices.coreinfrastructure.org/projects/164

Which means we actually have 25 passing right now (with many more that are *close*).


--- David A. Wheeler






--
Dan Kohn <mailto:dan@...>
Executive Director, Cloud Native Computing Foundation <https://cncf.io/>
tel:+1-415-233-1000


Lots of badging activity, OPNFV just got a badge

David A. Wheeler
 

Good news, we are seeing a significant influx in the number of projects pursuing a badge:
https://bestpractices.coreinfrastructure.org/project_stats
As of late yesterday we have 215 projects pursuing a badge; 50 were at 90% or greater, and 24 were at 100% (passing).

It won't show on the daily stats yet, but OPNFV just got a badge:
https://bestpractices.coreinfrastructure.org/projects/164
Which means we actually have 25 passing right now (with many more that are *close*).

--- David A. Wheeler


Syncthing badge :-)

Dale Visser
 

It makes me happy to see this: https://bestpractices.coreinfrastructure.org/projects/88

Syncthing is my current chosen solution for keeping redundant copies and a local backup (using a portable HDD connected to a Raspberry Pi) of our family photos and other important files. I noticed they're not showing their badge yet, so I submitted a pull request.
