from personal experience i have none* (i would have to have written the same software system twice, once with and once without formal methods, and the experience gained the first time around would bias the second attempt either way).
but a simple "productivity formal methods" query on scholar.google.com gives me: "Productivity on the project was the same or better than on comparable projects" pg 628/3, left column, second paragraph; by E.M. Clarke et al., "Formal methods: State of the art and future directions", 1996.
i think you mean that developing systems, i.e., the programming part, is less productive using formal methods; that may very well be. but the additional time you lose there you do not (or at least should not) have to spend on debugging/testing the formally correct system--given that the specifications themselves were correct (this moves the source of error to a higher level).
They would routinely use tools such as model and proof checkers with as much ease as they use compilers. Wing [1985] envisioned over ten years ago the idea of specification firms, analogous to architecture and law firms, whose employees would be hired for their skills in formal methods.
EDIT: could you share your experience, sans figures?
regarding the use of advanced techniques, my experience is mostly limited to lectures i attended (several on model checking, automated theorem proving, analysis and verification, and one on verification of compilers). i figure the field is roughly split into the following categories:
* model checking (error-detection/bug-discovery by using temporal logic formulas)
* interactive theorem proving, by formalizing denotational/operational semantics (e.g. the coq-based work on verification of compilers by xavier leroy at inria; various isabelle-based research projects, e.g. at tum.de and cam.ac.uk; and twelf/nuprl at cmu/cornell)
* program correctness, axiomatic semantics (hoare triples)
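to make the last category concrete: a hoare triple {P} C {Q} states that if precondition P holds before executing command C, postcondition Q holds afterwards. a minimal sketch in python, with the conditions merely checked at runtime via assert rather than proved (function name and example are mine, purely illustrative):

```python
# a hoare triple {P} C {Q}: if P holds before executing C,
# then Q holds afterwards. here: {x >= 0} y := x + 1 {y > x}
def increment_checked(x):
    assert x >= 0          # precondition P
    y = x + 1              # command C
    assert y > x           # postcondition Q
    return y

print(increment_checked(41))  # 42
```

the point of axiomatic semantics, of course, is to discharge P => Q statically for all inputs, not to test it at runtime as above.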
using model checkers is usually a good idea if you can specify correctness properties as temporal logic formulas. i once wrote a small quantification-based dsl where programmers could provide boolean predicates and set-generating functions, and domain experts would combine them to encode domain/system properties that were automatically checked or would even trigger follow-up actions. i also once attended a lecture by gerard berry of esterel technologies (they do a lot of verification for airbus), and AFAIR he predicted there will actually be dedicated "verification engineers" specifying formulas that ensure system correctness.
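a toy version of such a quantification dsl might look like this (all names here are hypothetical, this is a sketch of the idea, not the actual system):

```python
# toy quantification dsl: programmers supply boolean predicates and
# set-generating functions; domain experts combine them into
# properties that are checked automatically.
def forall(gen, pred):
    """property: pred holds for every element produced by gen."""
    return all(pred(x) for x in gen())

def exists(gen, pred):
    """property: pred holds for at least one element of gen."""
    return any(pred(x) for x in gen())

# programmer-supplied parts (hypothetical example data)
def active_sessions():            # set-generating function
    return [{"id": 1, "authenticated": True},
            {"id": 2, "authenticated": True}]

def is_authenticated(s):          # boolean predicate
    return s["authenticated"]

# domain-expert property: every active session is authenticated
print(forall(active_sessions, is_authenticated))  # True
```

a failing property could then trigger a follow-up action (logging, alerting) instead of just returning False.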
using systems that support counter-example guided abstraction refinement (CEGAR, helmut veith et al.), one can start from an imprecise abstraction and refine it iteratively.
iterative theorem proving is where i think the most important work is going on, albeit (in my perception) in an orthogonal direction. in a complex system, like a compiler for a non-toy language, correctness is one of the most important problems (e.g., in the embedded systems market, the automotive industry, etc.). german research funding is doing a lot here: previously verifix, now a research project named verisoft. the french are very active, too; i am not sure about the situation in north america, though.

in the course on verification of compilers, our lecturer told us that many companies in mechanical engineering were not really concerned with performance but rather with compilation being erroneous (one could check the bug databases of the common compilers for reassurance on that one). i think there will be a lot of job opportunities in this market in the future, where by formalizing domain knowledge/programming languages one could ensure system correctness. currently, the security-related industries are the champions here, but with the emerging proliferation of computing technology i can imagine a world where people are fed up with badly written software, which will force engineering to verify at least certain properties.
and last but not least: while i think axiomatic semantics are important, writing hoare triples is almost always too time-consuming. i have "the science of programming" by david gries, which should help my incompetent brain speed up here, but the truth is: it's non-trivial for many programs. this is why i feel that many things are going the "interactive theorem proving" way.
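to see why it gets non-trivial: already for a simple loop you need an invariant strong enough to be preserved by each iteration and to imply the postcondition at exit. a gries-style sketch, with the invariant checked at runtime rather than proved (function name and asserts are mine, illustrative only):

```python
# summing an array the gries way: the loop invariant
# "s == sum(a[:i])" must hold before and after every iteration;
# together with the exit condition i == len(a) it implies the
# postcondition s == sum(a). here it is merely asserted, not proved.
def array_sum(a):
    s, i = 0, 0
    while i < len(a):
        assert s == sum(a[:i])     # invariant holds on entry
        s, i = s + a[i], i + 1
        assert s == sum(a[:i])     # ... and is re-established
    assert s == sum(a)             # exit condition + invariant => postcondition
    return s

print(array_sum([1, 2, 3]))  # 6
```

finding such invariants is exactly the creative step that makes hand-written hoare proofs expensive, and where interactive provers can at least manage the bookkeeping.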
this pretty much sums it up for me. please note that i am not an expert on these issues--though very interested. also, axiomatic semantics and interactive theorem proving as mentioned above are not mutually exclusive; i.e., one should very well be able to ensure correctness via hoare triples proven in interactive theorem provers.
Thanks for the overview. I was hoping for some examples, but it seems most work on this is done in academia.
"Iterative" theorem proving (though probably a typo for "interactive") is the way this could go: "agile theorem proving", where you don't need to plan ahead.
i can imagine a world where people are fed up with badly written software, which will force engineering to verify at least certain properties.
Historically, people have consistently chosen features, convenience and performance over well-written software. It seems that formal methods only get a foothold where there are serious consequences for failure - often legal ones.
BTW: the US/UK vs. Germany/France split in computer science seems to me (from reading papers) to be pragmatism vs. theory. Reading on electromagnetism, Lord Kelvin took French scientists to task for errors traceable to their algebraic and abstract methods - so this divide seems to go back a ways.
EDIT: * none meaning no specific figures.