Conical inductors--still $10!...

On a sunny day (Thu, 23 Jul 2020 21:40:02 +0200) it happened Jeroen Belleman
<jeroen@nospam.please> wrote in <rfcp2h$edp$1@gioia.aioe.org>:

We don't want productivity, as in more new versions. We
want quality, robustness and durability.

Jeroen Belleman

Yes, what solution and programming languages are suitable
depends on the application hardware.
For example, a firing solution for a micro-sized drone will
have to have the math written for a very simple embedded system, maybe even in asm.
The same firing solution for, say, a jet can be done in whatever high-level language makes you drool.
I like Phil Hobbs' link to the story about that programmer and his use of the drum revolution time.

https://www.unisa.edu.au/Media-Centre/Releases/2020/is-it-a-bird-a-plane-not-superman-but-a-flapping-wing-drone/
For better pictures:
https://robotics.sciencemag.org/

And apart from the number of bugs in the higher-level version,
the failure rate also goes up with the number of components and chip size in a system,
especially in a radiation environment.
So from a robustness POV my choice would be the simple embedded version: not as easy to hack as most (Windows?) PCs either, and less power, so
greener.
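[Editor's note: the component-count argument can be made quantitative. Assuming independent components in series with constant failure rates (the usual simplification, not a claim about any particular system), the failure rates simply add, so system reliability falls off exponentially with part count:]

\[
R_{\mathrm{sys}}(t) \;=\; \prod_{i=1}^{n} R_i(t) \;=\; e^{-\lambda_{\mathrm{sys}} t},
\qquad
\lambda_{\mathrm{sys}} \;=\; \sum_{i=1}^{n} \lambda_i .
\]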
 
On Saturday, July 25, 2020 at 8:34:35 AM UTC+10, John Larkin wrote:
On Fri, 24 Jul 2020 23:15:27 +0100, Tom Gardner
spamjunk@blueyonder.co.uk> wrote:

On 24/07/20 19:09, John Larkin wrote:
On Fri, 24 Jul 2020 09:06:43 +0100, Tom Gardner
spamjunk@blueyonder.co.uk> wrote:

On 24/07/20 03:33, Bill Sloman wrote:
On Friday, July 24, 2020 at 4:34:25 AM UTC+10, John Larkin wrote:
On Thu, 23 Jul 2020 10:36:20 -0700 (PDT), Lasse Langwadt Christensen
langwadt@fonz.dk> wrote:

torsdag den 23. juli 2020 kl. 19.06.48 UTC+2 skrev John Larkin:
On Thu, 23 Jul 2020 17:39:57 +0100, Tom Gardner
spamjunk@blueyonder.co.uk> wrote:

On 23/07/20 16:13, jlarkin@highlandsniptechnology.com wrote:
On Thu, 23 Jul 2020 10:36:08 -0400, Phil Hobbs
pcdhSpamMeSenseless@electrooptical.net> wrote:

On 2020-07-22 20:14, John Larkin wrote:

<snip>

We don't need more compute power. We need reliability and user
friendliness.

Both of which are ill-defined concepts.

Executing buggy C faster won't help. Historically, adding resources
(virtual memory, big DRAM, threads, more MIPS) makes things worse.

For Pete's sake, we still have buffer overrun exploits. We still have
image files with trojans. We still have malicious web pages.
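[Editor's note: a minimal sketch of the buffer-overrun class of bug mentioned above, in plain C. The function names and buffer size are invented for illustration.]

    #include <stdio.h>
    #include <string.h>

    /* Classic overrun: copy with no length check. A long input writes past
       buf[] and can corrupt adjacent memory, including the return address. */
    void risky(const char *input)
    {
        char buf[16];
        strcpy(buf, input);              /* no bounds check -- exploitable */
        printf("%s\n", buf);
    }

    /* Bounds-checked version: never writes more than sizeof buf bytes. */
    void safer(const char *input)
    {
        char buf[16];
        strncpy(buf, input, sizeof buf - 1);
        buf[sizeof buf - 1] = '\0';      /* strncpy may leave buf unterminated */
        printf("%s\n", buf);
    }

    int main(int argc, char **argv)
    {
        if (argc > 1)
            safer(argv[1]);              /* swap in risky() to see the problem */
        return 0;
    }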

A tool that can cut wood can cut your hand; the only way to totally prevent that is to add safety features until it cannot cut anything anymore.

Why not design a compute architecture that is fundamentally safe?
Instead of endlessly creating and patching bugs.

https://en.wikipedia.org/wiki/Z_notation

is an example of that approach. It doesn't seem to be ruling the world at the moment.

https://en.wikipedia.org/wiki/Hoare_logic

is a bit older.
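[Editor's note: for readers who haven't met it, a Hoare triple {P} S {Q} asserts that if precondition P holds and statement S terminates, then postcondition Q holds afterwards. A minimal worked instance of the assignment axiom:]

\[
\{\,Q[E/x]\,\}\; x := E \;\{\,Q\,\},
\qquad\text{e.g.}\qquad
\{\,x = a\,\}\; x := x + 1 \;\{\,x = a + 1\,\}.
\]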

The Viper provably correct computer is more recent - 1987.

https://www.cl.cam.ac.uk/archive/mjcg/papers/cohn1987.pdf

It doesn't seem to have got anywhere either. I heard a bit about it before we left Cambridge (UK) in 1993.

Viper was interesting - a processor with a formal
mathematical proof of correctness. RSRE absolutely did
not want people to think it might be used in missile
fire control systems, oh no, never.

IIRC they flogged it to the Australians, then the Australians
noted there was a missing step between the top level spec
and the implementation. They sued and won.

There are three problems with any component that is
mathematically proven:
- most of the system isn't mathematically proven
- is the initial spec "correct"?
- it is too difficult to do in practice

- jerks will still hack ugly programs.

I don't remember NewSpeak, the associated programming
language, ever becoming practical

No language will fix the mess we have. Serious hardware protection
will.

No, it cannot, for deep theoretical and deep practical
reasons.

A trivial example: there's no way that hardware protection
can protect against some idiot using addition where
subtraction is required.
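[Editor's note: a concrete sketch of that point. The routine below is well-typed and touches only its own memory, so no MMU, capability or bounds-checking hardware will ever flag it; only the specification says it is wrong. The names and units are invented for the example.]

    /* Spec: a payment reduces what the customer owes. */
    long apply_payment(long owed_cents, long payment_cents)
    {
        /* Bug: '+' where '-' was required. Perfectly legal code, invisible
           to hardware protection; it just computes the wrong answer. */
        return owed_cents + payment_cents;   /* should be: owed_cents - payment_cents */
    }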

Sure, the program will report his bank balance wrong. Or abend. But it
needn't crash the system, or inject viruses, or ransomware everything.

These are just different ways of having the wrong result.

That's not a theoretical example. A few years ago my energy
supplier set my monthly payments too low. When it noticed
that I was getting further behind with my payments, it
responded by /reducing/ my monthly payments. Rinse and
repeat another two times!

As for "better" languages, they help by reducing the
opportunities for making boring old preventable mistakes.

It should be flat impossible for any application program to compromise
the OS, or any other unrelated application. Intel and Microsoft are
just criminally stupid. I don't understand why they are not liable for
damages.

That's an ideal. Sadly, we don't live in an ideal world, and the people who promise us that we could, if only we did things their way, aren't to be trusted.

Many of them know that they are lying, and the ones who sincerely believe their own claims are even more dangerous.

We are in the dark ages of computing. Like steam engines blowing up
and poaching everybody nearby.

John Larkin lives in his own personal dark age. He doesn't know much and comforts himself with the delusion that everybody else is equally miserably ignorant. His grandiose self-image prevents him from noticing that this isn't always true.

--
Bill Sloman, Sydney
 
On Fri, 24 Jul 2020 09:06:43 +0100, Tom Gardner
<spamjunk@blueyonder.co.uk> wrote:

On 24/07/20 03:33, Bill Sloman wrote:
On Friday, July 24, 2020 at 4:34:25 AM UTC+10, John Larkin wrote:
On Thu, 23 Jul 2020 10:36:20 -0700 (PDT), Lasse Langwadt Christensen
langwadt@fonz.dk> wrote:

torsdag den 23. juli 2020 kl. 19.06.48 UTC+2 skrev John Larkin:
On Thu, 23 Jul 2020 17:39:57 +0100, Tom Gardner
spamjunk@blueyonder.co.uk> wrote:

On 23/07/20 16:13, jlarkin@highlandsniptechnology.com wrote:
On Thu, 23 Jul 2020 10:36:08 -0400, Phil Hobbs
pcdhSpamMeSenseless@electrooptical.net> wrote:

On 2020-07-22 20:14, John Larkin wrote:

I actually designed a CPU with all TTL logic. It had three
instructions and a 20 KHz 4-phase clock. It was actually produced, for
a shipboard data logger. MACRO-11 had great macro tools, so we used
that to make a cross assembler.

When I was at Tulane, the EE department acquired a gigantic (basically
a room full) military surplus computer that used a drum memory for
program and data. The logic modules were big gold-plated hermetic cans
that plugged in. The programmer had to distribute the opcodes at
optimal angular positions on the spinning drum.
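[Editor's note: a small sketch of that "optimum programming" trick, in C. The drum geometry and timings below are invented for illustration, not taken from the machine described above.]

    #include <stdio.h>

    #define SLOTS_PER_REV 64         /* word slots around the drum (assumed) */
    #define REV_TIME_US   20000.0    /* one revolution in microseconds (assumed) */
    #define SLOT_TIME_US  (REV_TIME_US / SLOTS_PER_REV)

    /* Place the next instruction so it arrives under the read head just
       after the current one (taking exec_us) has finished executing. */
    int next_slot(int current_slot, double exec_us)
    {
        int skip = (int)(exec_us / SLOT_TIME_US) + 1;  /* leave at least one slot of margin */
        return (current_slot + skip) % SLOTS_PER_REV;
    }

    int main(void)
    {
        /* e.g. a 700 us operation fetched from slot 10 */
        printf("place next opcode at slot %d\n", next_slot(10, 700.0));
        return 0;
    }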

I have a book, IBM's Early Computers. In early days, nobody was
entirely sure what a computer was.


It's a fun book, and does a lot to deflate the Harvard spin, which is
always good.

The sequel on the 360 and early 370s is a good read too, as is "The
Mythical Man-Month" by Fred Brooks, who was in charge of OS/360, at the
time by far the largest programming project in the world. As he says,
"How does a software project go a year late? One day at a time."

Obligatory Real Programmer reference:

http://www.cs.utah.edu/~elb/folklore/mel.html

Cheers

Phil Hobbs

Burroughs programmed their computers in Algol. There was never any
other assembler or compiler. I was told that, after the Algol compiler
was written in Algol, two guys hand-compiled it to machine code,
working side-by-side and checking every opcode. That was the bootstrap
compiler.

Isn't our ancient and settled idea of what a computer is, and what an
OS and languages are, overdue for the next revolution?

The trick will be to get a revolution which starts from
where we are. There is no chance of completely throwing
out all that has been achieved until now, however appealing
that might be.

I know of two plausible starting points...

1) The Mill Processor, as described by Ivan Godard over
on comp.arch. This has many innovative techniques that,
in effect, bring DSP processor parallelism when executing
standard languages such as C. It appears that there's an
order of magnitude to be gained.

Incidentally, Godard's background is the Burroughs/Unisys
Algol machines, plus /much/ more.


2) xCORE processors are commercially available (unlike the
Mill). They start from presuming that embedded programs can
be highly parallel /iff/ the hardware and software allows
programmers to express it cleanly. They merge Hoare's CSP
with innovative hardware to /guarantee/ *hard* realtime
performance. In effect they have occupied a niche that is
halfway between conventional processors and FPGA.

I've used them, and they are *easy* and fun to use.
(Cf C on a conventional processor!)
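[Editor's note: the CSP idea in one very loose plain-C sketch: two sequential processes that share no memory and interact only by passing messages over a channel-like pipe. Real xC on xCORE adds hardware channels, select/alt guards and hard timing guarantees; this only gives the flavour.]

    #include <stdio.h>
    #include <unistd.h>
    #include <sys/wait.h>

    int main(void)
    {
        int ch[2];                        /* ch[0] = read end, ch[1] = write end */
        if (pipe(ch) != 0) return 1;

        if (fork() == 0) {                /* consumer process */
            close(ch[1]);
            int v;
            while (read(ch[0], &v, sizeof v) == (ssize_t)sizeof v)
                printf("consumer got %d\n", v);
            _exit(0);
        }

        close(ch[0]);                     /* producer process */
        for (int v = 0; v < 5; v++)
            write(ch[1], &v, sizeof v);   /* send v down the channel */
        close(ch[1]);                     /* end of stream */
        wait(NULL);
        return 0;
    }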

We don't need more compute power. We need reliability and user
friendliness.

Executing buggy C faster won't help. Historically, adding resources
(virtual memory, big DRAM, threads, more MIPS) makes things worse.

For Pete's sake, we still have buffer overrun exploits. We still have
image files with trojans. We still have malicious web pages.

A tool that can cut wood can cut your hand; the only way to totally prevent that
is to add safety features until it cannot cut anything anymore.

Why not design a compute architecture that is fundamentally safe?
Instead of endlessly creating and patching bugs.

https://en.wikipedia.org/wiki/Z_notation

is an example of that approach. It doesn't seem to be ruling the world at the moment.

https://en.wikipedia.org/wiki/Hoare_logic

is a bit older.

The Viper provably correct computer is more recent - 1987.

https://www.cl.cam.ac.uk/archive/mjcg/papers/cohn1987.pdf

It doesn't seem to have got anywhere either. I heard a bit about it before we left Cambridge (UK) in 1993.

Viper was interesting - a processor with a formal
mathematical proof of correctness. RSRE absolutely did
not want people to think it might be used in missile
fire control systems, oh no, never.

IIRC they flogged it to the Australians, then the Australians
noted there was a missing step between the top level spec
and the implementation. They sued and won.

There are three problems with any component that is
mathematically proven:
- most of the system isn't mathematically proven
- is the initial spec "correct"?
- it is too difficult to do in practice

- jerks will still hack ugly programs.



I don't remember NewSpeak, the associated programming
language, ever becoming practical

No language will fix the mess we have. Serious hardware protection
will.
 
On 24/07/20 19:09, John Larkin wrote:
On Fri, 24 Jul 2020 09:06:43 +0100, Tom Gardner
spamjunk@blueyonder.co.uk> wrote:

On 24/07/20 03:33, Bill Sloman wrote:
On Friday, July 24, 2020 at 4:34:25 AM UTC+10, John Larkin wrote:
On Thu, 23 Jul 2020 10:36:20 -0700 (PDT), Lasse Langwadt Christensen
langwadt@fonz.dk> wrote:

torsdag den 23. juli 2020 kl. 19.06.48 UTC+2 skrev John Larkin:
On Thu, 23 Jul 2020 17:39:57 +0100, Tom Gardner
spamjunk@blueyonder.co.uk> wrote:

On 23/07/20 16:13, jlarkin@highlandsniptechnology.com wrote:
On Thu, 23 Jul 2020 10:36:08 -0400, Phil Hobbs
pcdhSpamMeSenseless@electrooptical.net> wrote:

On 2020-07-22 20:14, John Larkin wrote:

I actually designed a CPU with all TTL logic. It had three
instructions and a 20 KHz 4-phase clock. It was actually produced, for
a shipboard data logger. MACRO-11 had great macro tools, so we used
that to make a cross assembler.

When I was at Tulane, the EE department acquired a gigantic (basically
a room full) military surplus computer that used a drum memory for
program and data. The logic modules were big gold-plated hermetic cans
that plugged in. The programmer had to distribute the opcodes at
optimal angular positions on the spinning drum.

I have a book, IBM's Early Computers. In early days, nobody was
entirely sure what a computer was.


It's a fun book, and does a lot to deflate the Harvard spin, which is
always good.

The sequel on the 360 and early 370s is a good read too, as is "The
Mythical Man-Month" by Fred Brooks, who was in charge of OS/360, at the
time by far the largest programming project in the world. As he says,
"How does a software project go a year late? One day at a time."

Obligatory Real Programmer reference:

http://www.cs.utah.edu/~elb/folklore/mel.html

Cheers

Phil Hobbs

Burroughs programmed their computers in Algol. There was never any
other assembler or compiler. I was told that, after the Algol compiler
was written in Algol, two guys hand-compiled it to machine code,
working side-by-side and checking every opcode. That was the bootstrap
compiler.

Isn't our ancient and settled idea of what a computer is, and what an
OS and languages are, overdue for the next revolution?

The trick will be to get a revolution which starts from
where we are. There is no chance of completely throwing
out all that has been achieved until now, however appealing
that might be.

I know of two plausible starting points...

1) The Mill Processor, as described by Ivan Godard over
on comp.arch. This has many innovative techniques that,
in effect, bring DSP processor parallelism when executing
standard languages such as C. It appears that there's an
order of magnitude to be gained.

Incidentally, Godard's background is the Burroughs/Unisys
Algol machines, plus /much/ more.


2) xCORE processors are commercially available (unlike the
Mill). They start from presuming that embedded programs can
be highly parallel /iff/ the hardware and software allows
programmers to express it cleanly. They merge Hoare's CSP
with innovative hardware to /guarantee/ *hard* realtime
performance. In effect they have occupied a niche that is
halfway between conventional processors and FPGA.

I've used them, and they are *easy* and fun to use.
(Cf C on a conventional processor!)

We don't need more compute power. We need reliability and user
friendliness.

Executing buggy C faster won't help. Historically, adding resources
(virtual memory, big DRAM, threads, more MIPS) makes things worse.

For Pete's sake, we still have buffer overrun exploits. We still have
image files with trojans. We still have malicious web pages.

A tool that can cut wood can cut your hand; the only way to totally prevent that
is to add safety features until it cannot cut anything anymore.

Why not design a compute architecture that is fundamentally safe?
Instead of endlessly creating and patching bugs.

https://en.wikipedia.org/wiki/Z_notation

is an example of that approach. It doesn't seem to be ruling the world at the moment.

https://en.wikipedia.org/wiki/Hoare_logic

is a bit older.

The Viper provably correct computer is more recent - 1987.

https://www.cl.cam.ac.uk/archive/mjcg/papers/cohn1987.pdf

It doesn't seem to have got anywhere either. I heard a bit about it before we left Cambridge (UK) in 1993.

Viper was interesting - a processor with a formal
mathematical proof of correctness. RSRE absolutely did
not want people to think it might be used in missile
fire control systems, oh no, never.

IIRC they flogged it to the Australians, then the Australians
noted there was a missing step between the top level spec
and the implementation. They sued and won.

There are three problems with any component that is
mathematically proven:
- most of the system isn't mathematically proven
- is the initial spec "correct"?
- it is too difficult to do in practice

- jerks will still hack ugly programs.


I don't remember NewSpeak, the associated programming
language, ever becoming practical

No language will fix the mess we have. Serious hardware protection
will.

No, it cannot, for deep theoretical and deep practical
reasons.

A trivial example: there's no way that hardware protection
can protect against some idiot using addition where
subtraction is required.

That's not a theoretical example. A few years ago my energy
supplier set my monthly payments too low. When it noticed
that I was getting further behind with my payments, it
responded by /reducing/ my monthly payments. Rinse and
repeat another two times!

As for "better" languages, they help by reducing the
opportunities for making boring old preventable mistakes.
 
On Fri, 24 Jul 2020 23:15:27 +0100, Tom Gardner
<spamjunk@blueyonder.co.uk> wrote:

On 24/07/20 19:09, John Larkin wrote:
On Fri, 24 Jul 2020 09:06:43 +0100, Tom Gardner
spamjunk@blueyonder.co.uk> wrote:

On 24/07/20 03:33, Bill Sloman wrote:
On Friday, July 24, 2020 at 4:34:25 AM UTC+10, John Larkin wrote:
On Thu, 23 Jul 2020 10:36:20 -0700 (PDT), Lasse Langwadt Christensen
langwadt@fonz.dk> wrote:

torsdag den 23. juli 2020 kl. 19.06.48 UTC+2 skrev John Larkin:
On Thu, 23 Jul 2020 17:39:57 +0100, Tom Gardner
spamjunk@blueyonder.co.uk> wrote:

On 23/07/20 16:13, jlarkin@highlandsniptechnology.com wrote:
On Thu, 23 Jul 2020 10:36:08 -0400, Phil Hobbs
pcdhSpamMeSenseless@electrooptical.net> wrote:

On 2020-07-22 20:14, John Larkin wrote:

I actually designed a CPU with all TTL logic. It had three
instructions and a 20 KHz 4-phase clock. It was actually produced, for
a shipboard data logger. MACRO-11 had great macro tools, so we used
that to make a cross assembler.

When I was at Tulane, the EE department acquired a gigantic (basically
a room full) military surplus computer that used a drum memory for
program and data. The logic modules were big gold-plated hermetic cans
that plugged in. The programmer had to distribute the opcodes at
optimal angular positions on the spinning drum.

I have a book, IBM's Early Computers. In early days, nobody was
entirely sure what a computer was.


It's a fun book, and does a lot to deflate the Harvard spin, which is
always good.

The sequel on the 360 and early 370s is a good read too, as is "The
Mythical Man-Month" by Fred Brooks, who was in charge of OS/360, at the
time by far the largest programming project in the world. As he says,
"How does a software project go a year late? One day at a time."

Obligatory Real Programmer reference:

http://www.cs.utah.edu/~elb/folklore/mel.html

Cheers

Phil Hobbs

Burroughs programmed their computers in Algol. There was never any
other assembler or compiler. I was told that, after the Algol compiler
was written in Algol, two guys hand-compiled it to machine code,
working side-by-side and checking every opcode. That was the bootstrap
compiler.

Isn't our ancient and settled idea of what a computer is, and what an
OS and languages are, overdue for the next revolution?

The trick will be to get a revolution which starts from
where we are. There is no chance of completely throwing
out all that has been achieved until now, however appealing
that might be.

I know of two plausible starting points...

1) The Mill Processor, as described by Ivan Godard over
on comp.arch. This has many innovative techniques that,
in effect, bring DSP processor parallelism when executing
standard languages such as C. It appears that there's an
order of magnitude to be gained.

Incidentally, Godard's background is the Burroughs/Unisys
Algol machines, plus /much/ more.


2) xCORE processors are commercially available (unlike the
Mill). They start from presuming that embedded programs can
be highly parallel /iff/ the hardware and software allows
programmers to express it cleanly. They merge Hoare's CSP
with innovative hardware to /guarantee/ *hard* realtime
performance. In effect they have occupied a niche that is
halfway between conventional processors and FPGA.

I've used them, and they are *easy* and fun to use.
(Cf C on a conventional processor!)

We don't need more compute power. We need reliability and user
friendliness.

Executing buggy C faster won't help. Historically, adding resources
(virtual memory, big DRAM, threads, more MIPS) makes things worse.

For Pete's sake, we still have buffer overrun exploits. We still have
image files with trojans. We still have malicious web pages.

A tool that can cut wood can cut your hand; the only way to totally prevent that
is to add safety features until it cannot cut anything anymore.

Why not design a compute architecture that is fundamentally safe?
Instead of endlessly creating and patching bugs.

https://en.wikipedia.org/wiki/Z_notation

is an example of that approach. It doesn't seem to be ruling the world at the moment.

https://en.wikipedia.org/wiki/Hoare_logic

is a bit older.

The Viper provably correct computer is more recent - 1987.

https://www.cl.cam.ac.uk/archive/mjcg/papers/cohn1987.pdf

It doesn't seem to have got anywhere either. I heard a bit about it before we left Cambridge (UK) in 1993.

Viper was interesting - a processor with a formal
mathematical proof of correctness. RSRE absolutely did
not want people to think it might be used in missile
fire control systems, oh no, never.

IIRC they flogged it to the Australians, then the Australians
noted there was a missing step between the top level spec
and the implementation. They sued and won.

There are three problems with any component that is
mathematically proven:
- most of the system isn't mathematically proven
- is the initial spec "correct"?
- it is too difficult to do in practice

- jerks will still hack ugly programs.


I don't remember NewSpeak, the associated programming
language, ever becoming practical

No language will fix the mess we have. Serious hardware protection
will.

No, it cannot, for deep theoretical and deep practical
reasons.

A trivial example: there's no way that hardware protection
can protect against some idiot using addition where
subtraction is required.

Sure, the program will report his bank balance wrong. Or abend. But it
needn't crash the system, or inject viruses, or ransomware everything.

That's not a theoretical example. A few years ago my energy
supplier set my monthly payments too low. When it noticed
that I was getting further behind with my payments, it
responded by /reducing/ my monthly payments. Rinse and
repeat another two times!

As for "better" languages, they help by reducing the
opportunities for making boring old preventable mistakes.

It should be flat impossible for any application program to compromise
the OS, or any other unrelated application. Intel and Microsoft are
just criminally stupid. I don't understand why they are not liable for
damages.

We are in the dark ages of computing. Like steam engines blowing up
and poaching everybody nearby.
 
On Friday, July 24, 2020 at 11:09:31 AM UTC-7, John Larkin wrote:


[ about computing bugs ]

No language will fix the mess we have. Serious hardware protection
will.

Like what? Error-correcting memory? Redundant CPUs and voting?
Nonrewritable firmware?

There aren't any hardware solutions to (for instance) facial recognition that
unlocks a phone on seeing a face of a child who resembles his parent.
 
On 24/07/20 23:34, John Larkin wrote:
On Fri, 24 Jul 2020 23:15:27 +0100, Tom Gardner
spamjunk@blueyonder.co.uk> wrote:

On 24/07/20 19:09, John Larkin wrote:
On Fri, 24 Jul 2020 09:06:43 +0100, Tom Gardner
spamjunk@blueyonder.co.uk> wrote:

On 24/07/20 03:33, Bill Sloman wrote:
On Friday, July 24, 2020 at 4:34:25 AM UTC+10, John Larkin wrote:
On Thu, 23 Jul 2020 10:36:20 -0700 (PDT), Lasse Langwadt Christensen
langwadt@fonz.dk> wrote:

torsdag den 23. juli 2020 kl. 19.06.48 UTC+2 skrev John Larkin:
On Thu, 23 Jul 2020 17:39:57 +0100, Tom Gardner
spamjunk@blueyonder.co.uk> wrote:

On 23/07/20 16:13, jlarkin@highlandsniptechnology.com wrote:
On Thu, 23 Jul 2020 10:36:08 -0400, Phil Hobbs
pcdhSpamMeSenseless@electrooptical.net> wrote:

On 2020-07-22 20:14, John Larkin wrote:

I actually designed a CPU with all TTL logic. It had three
instructions and a 20 KHz 4-phase clock. It was actually produced, for
a shipboard data logger. MACRO-11 had great macro tools, so we used
that to make a cross assembler.

When I was at Tulane, the EE department acquired a gigantic (basically
a room full) military surplus computer that used a drum memory for
program and data. The logic modules were big gold-plated hermetic cans
that plugged in. The programmer had to distribute the opcodes at
optimal angular positions on the spinning drum.

I have a book, IBM's Early Computers. In early days, nobody was
entirely sure what a computer was.


It's a fun book, and does a lot to deflate the Harvard spin, which is
always good.

The sequel on the 360 and early 370s is a good read too, as is "The
Mythical Man-Month" by Fred Brooks, who was in charge of OS/360, at the
time by far the largest programming project in the world. As he says,
"How does a software project go a year late? One day at a time."

Obligatory Real Programmer reference:

http://www.cs.utah.edu/~elb/folklore/mel.html

Cheers

Phil Hobbs

Burroughs programmed their computers in Algol. There was never any
other assembler or compiler. I was told that, after the Algol compiler
was written in Algol, two guys hand-compiled it to machine code,
working side-by-side and checking every opcode. That was the bootstrap
compiler.

Isn't our ancient and settled idea of what a computer is, and what an
OS and languages are, overdue for the next revolution?

The trick will be to get a revolution which starts from
where we are. There is no chance of completely throwing
out all that has been achieved until now, however appealing
that might be.

I know of two plausible starting points...

1) The Mill Processor, as described by Ivan Godard over
on comp.arch. This has many innovative techniques that,
in effect, bring DSP processor parallelism when executing
standard languages such as C. It appears that there's an
order of magnitude to be gained.

Incidentally, Godard's background is the Burroughs/Unisys
Algol machines, plus /much/ more.


2) xCORE processors are commercially available (unlike the
Mill). They start from presuming that embedded programs can
be highly parallel /iff/ the hardware and software allows
programmers to express it cleanly. They merge Hoare's CSP
with innovative hardware to /guarantee/ *hard* realtime
performance. In effect they have occupied a niche that is
halfway between conventional processors and FPGA.

I've used them, and they are *easy* and fun to use.
(Cf C on a conventional processor!)

We don't need more compute power. We need reliability and user
friendliness.

Executing buggy C faster won't help. Historically, adding resources
(virtual memory, big DRAM, threads, more MIPS) makes things worse.

For Pete's sake, we still have buffer overrun exploits. We still have
image files with trojans. We still have malicious web pages.

A tool that can cut wood can cut your hand; the only way to totally prevent that
is to add safety features until it cannot cut anything anymore.

Why not design a compute architecture that is fundamentally safe?
Instead of endlessly creating and patching bugs.

https://en.wikipedia.org/wiki/Z_notation

is an example of that approach. It doesn't seem to be ruling the world at the moment.

https://en.wikipedia.org/wiki/Hoare_logic

is a bit older.

The Viper provably correct computer is more recent - 1987.

https://www.cl.cam.ac.uk/archive/mjcg/papers/cohn1987.pdf

It doesn't seem to have got anywhere either. I heard a bit about it before we left Cambridge (UK) in 1993.

Viper was interesting - a processor with a formal
mathematical proof of correctness. RSRE absolutely did
not want people to think it might be used in missile
fire control systems, oh no, never.

IIRC they flogged it to the Australians, then the Australians
noted there was a missing step between the top level spec
and the implementation. They sued and won.

There are three problems with any component that is
mathematically proven:
- most of the system isn't mathematically proven
- is the initial spec "correct"?
- it is too difficult to do in practice

- jerks will still hack ugly programs.


I don't remember NewSpeak, the associated programming
language, ever becoming practical

No language will fix the mess we have. Serious hardware protection
will.

No, it cannot, for deep theoretical and deep practical
reasons.

A trivial example: there's no way that hardware protection
can protect against some idiot using addition where
subtraction is required.

Sure, the program will report his bank balance wrong. Or abend. But it
needn't crash the system, or inject viruses, or ransomware everything.


That's not a theoretical example. A few years ago my energy
supplier set my monthly payments too low. When it noticed
that I was getting further behind with my payments, it
responded by /reducing/ my monthly payments. Rinse and
repeat another two times!

As for "better" languages, they help by reducing the
opportunities for making boring old preventable mistakes.

It should be flat impossible for any application program to compromise
the OS, or any other unrelated application. Intel and Microsoft are
just criminally stupid. I don't understand why they are not liable for
damages.

We are in the dark ages of computing. Like steam engines blowing up
and poaching everybody nearby.

Agreed.

But there are no silver bullets that can "fix the mess we
have". "The mess" is too many significantly different
messes, including philosophy and human frailties.

[1] e.g. what do you mean by "correct"
 
On 24/07/20 23:34, John Larkin wrote:
On Fri, 24 Jul 2020 23:15:27 +0100, Tom Gardner
spamjunk@blueyonder.co.uk> wrote:

On 24/07/20 19:09, John Larkin wrote:
On Fri, 24 Jul 2020 09:06:43 +0100, Tom Gardner
spamjunk@blueyonder.co.uk> wrote:

On 24/07/20 03:33, Bill Sloman wrote:
On Friday, July 24, 2020 at 4:34:25 AM UTC+10, John Larkin wrote:
On Thu, 23 Jul 2020 10:36:20 -0700 (PDT), Lasse Langwadt Christensen
langwadt@fonz.dk> wrote:

torsdag den 23. juli 2020 kl. 19.06.48 UTC+2 skrev John Larkin:
On Thu, 23 Jul 2020 17:39:57 +0100, Tom Gardner
spamjunk@blueyonder.co.uk> wrote:

On 23/07/20 16:13, jlarkin@highlandsniptechnology.com wrote:
On Thu, 23 Jul 2020 10:36:08 -0400, Phil Hobbs
pcdhSpamMeSenseless@electrooptical.net> wrote:

On 2020-07-22 20:14, John Larkin wrote:

I actually designed a CPU with all TTL logic. It had three
instructions and a 20 KHz 4-phase clock. It was actually produced, for
a shipboard data logger. MACRO-11 had great macro tools, so we used
that to make a cross assembler.

When I was a Tulane, the EE department acquired a gigantic (basically
a room full) military surplus computer that used a drum memory for
program and data. The logic modules were big gold-plated hermetic cans
that plugged in. The programmer had to distribute the opcodes at
optimal angular positions on the spinning drum.

I have a book, IBM\'s Early Computers. In early days, nobody was
entirely sure what a computer was.


It\'s a fun book, and does a lot to deflate the Harvard spin, which is
always good.

The sequel on the 360 and early 370s is a good read too, as is \"The
Mythical Man Month\" by Fred Brooks, who was in charge of OS/360, at the
time by far the largest programming project in the world. As he says,
\"How does a software project go a year late? One day at a time.\"

Obligatory Real Programmer reference:

http://www.cs.utah.edu/~elb/folklore/mel.html

Cheers

Phil Hobbs

Burroughs programmed their computers in Algol. There was never any
other assembler or compiler. I was told that, after the Algol compiler
was written in Algol, two guys hand-compiled it to machine code,
working side-by-side and checking every opcode. That was the bootstrap
compiler.

Isn\'t our ancient and settled idea of what a computer is, and what an
OS and languages are, overdue for the next revolution?

The trick will be to get a revolution which starts from
where we are. There is no chance of completely throwing
out all that has been achieved until now, however appealing
that might be.

I know of two plausible starting points...

1) The Mill Processor, as described by Ivan Godard over
on comp.arch. This has many innovative techniques that,
in effect, bring DSP processor parallelism when executing
standard languages such as C. It appears that there\'s an
order of magnitude to be gained.

Incidentally, Godard\'s background is the Burroughs/Unisys
Algol machines, plus /much/ more.


2) xCORE processors are commercially available (unlike the
Mill). They start from presuming that embedded programs can
be highly parallel /iff/ the hardware and software allows
programmers to express it cleanly. They merge Hoare\'s CSP
with innovative hardware to /guarantee/ *hard* realtime
performance. In effect they have occupied a niche that is
halfway between conventional processors and FPGA.

I\'ve used them, and they are *easy* and fun to use.
(Cf C on a conventional processor!)

We don\'t need more compute power. We need reliability and user
friendliness.

Executing buggy c faster won\'t help. Historically, adding resources
(virtual memory, big DRAM, threads, more MIPS) makes things worse.

For Pete\'s sake, we still have buffer overrun exploits. We still have
image files with trojans. We still have malicious web pages.

a tool that can cut wood can cut your hand, only way totally prevent that
is to add safety features until it cannot cut anything anymore

Why not design a compute architecture that is fundamentally safe?
Instead of endlessly creating and patching bugs.

https://en.wikipedia.org/wiki/Z_notation

is an example of that approach. It doesn\'t seem to be ruling the world at the moment.

https://en.wikipedia.org/wiki/Hoare_logic

is a bit older.

The Viper provably correct computer is more recent - 1987.

https://www.cl.cam.ac.uk/archive/mjcg/papers/cohn1987.pdf

It doesn\'t seem to got anywhere either. I heard a bit about it before we left Cambridge (UK) in 1993.

Viper was interesting - a processor with a formal
mathematical proof of correctness. RSRE absolutely did
not want people to think it might be used in missile
fire control systems, oh no, never.

IIRC they flogged it to the Australians, who then noted there was
a missing step between the top-level spec and the implementation.
They sued and won.

There are three problems with any component that is
mathematically proven:
- most of the system isn't mathematically proven
- is the initial spec "correct"?
- it is too difficult to do in practice

- jerks will still hack ugly programs.


I don't remember NewSpeak, the associated programming
language, ever becoming practical.

No language will fix the mess we have. Serious hardware protection
will.

No, it cannot, for deep theoretical and deep practical
reasons.

A trivial example: there's no way that hardware protection
can protect against some idiot using addition where
subtraction is required.

Sure, the program will report his bank balance wrong. Or abend. But it
needn't crash the system, or inject viruses, or ransomware everything.


That's not a theoretical example. A few years ago my energy
supplier set my monthly payments too low. When it noticed
that I was getting further behind with my payments, it
responded by /reducing/ my monthly payments. Rinse and
repeat another two times!
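
That failure is exactly the addition-for-subtraction class of bug: one
flipped sign turns negative feedback into positive feedback, and
hardware protection never notices because every memory access is
perfectly legal. A made-up sketch of how such a thing might look
(adjust_payment and all the numbers are hypothetical):

#include <stdio.h>

/* Hypothetical monthly-payment adjuster. "arrears" is how far the
   customer is behind (positive = behind). The intent is to raise
   the payment when arrears grow. */
double adjust_payment(double payment, double arrears, double gain)
{
    /* Buggy version: the sign is flipped, so falling behind makes the
       payment *smaller* and the arrears grow faster - positive
       feedback. Hardware protection is perfectly happy with this. */
    return payment - gain * arrears;

    /* Intended version:
       return payment + gain * arrears;                              */
}

int main(void)
{
    double payment = 100.0;
    double arrears = 0.0;

    for (int month = 1; month <= 4; month++) {
        arrears += 120.0 - payment;   /* invented monthly usage of 120 */
        payment  = adjust_payment(payment, arrears, 0.5);
        printf("month %d: payment %.2f, arrears %.2f\n",
               month, payment, arrears);
    }
    return 0;
}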

As for \"better\" languages, they help by reducing the
opportunities for making boring old preventable mistakes.

It should be flat impossible for any application program to compromise
the OS, or any other unrelated application. Intel and Microsoft are
just criminally stupid. I don't understand why they are not liable for
damages.

We are in the dark ages of computing. Like steam engines blowing up
and poaching everybody nearby.

Agreed.

But there are no silver bullets that can "fix the mess we
have". "The mess" is too many significantly different
messes, including philosophy and human frailties.

[1] e.g. what do you mean by "correct"?
 
On 25/07/20 00:31, whit3rd wrote:
On Friday, July 24, 2020 at 11:09:31 AM UTC-7, John Larkin wrote:


[ about computing bugs ]

No language will fix the mess we have. Serious hardware protection
will.

Like what? Error-correcting memory? Redundant CPUs and voting?
Nonrewritable firmware?

There aren't any hardware solutions to (for instance) facial recognition
that unlocks a phone on seeing the face of a child who resembles his parent.

Yup, and once we get into classifications, there are infinite
examples.

How would you classify a table that one or more people
are sitting on? Or (as in my dining room) a chair with
a potted plant on it?

Or, since dogs are four-legged mammals, a dog that has
had a leg amputated?

And then there's the whole emerging topic of machine
learning "adversarial attacks".
https://medium.com/onfido-tech/adversarial-attacks-and-defences-for-convolutional-neural-networks-66915ece52e7
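
(The canonical demonstration, if I remember the Goodfellow et al. paper
correctly, is the "fast gradient sign" perturbation: take a correctly
classified image x and form

    x' = x + epsilon * sign( grad_x J(theta, x, y) )

with epsilon small enough that a human sees no difference, and the
network confidently reclassifies a panda as a gibbon. The learned
decision boundary is nothing like the one humans use.)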
 
On 2020-07-24 18:34, John Larkin wrote:
On 2020-07-22 20:14, John Larkin wrote:

I actually designed a CPU with all TTL logic. It had three
instructions and a 20 KHz 4-phase clock. It was actually produced, for
a shipboard data logger. MACRO-11 had great macro tools, so we used
that to make a cross assembler.

<snip>

It should be flat impossible for any application program to compromise
the OS, or any other unrelated application. Intel and Microsoft are
just criminally stupid. I don't understand why they are not liable for
damages.

We are in the dark ages of computing. Like steam engines blowing up
and poaching everybody nearby.

Check out Qubes OS, which is what I run daily. It addresses most of the
problems you note by encouraging you to run browsers in disposable VMs
and otherwise containing the pwnage.
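
For anyone who wants to try the disposable-VM habit before committing
to Qubes as a daily driver: on Qubes 4.x, if I remember the tooling
correctly, a one-liner from dom0 along the lines of

    qvm-run --dispvm=fedora-XX-dvm firefox

spins up a throwaway VM, runs the browser in it, and destroys the whole
VM when the browser exits. (Substitute whatever disposable-VM template
you actually have installed; "fedora-XX-dvm" is just a placeholder.)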

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC / Hobbs ElectroOptics
Optics, Electro-optics, Photonics, Analog Electronics
Briarcliff Manor NY 10510

http://electrooptical.net
http://hobbs-eo.com
 
On Sat, 25 Jul 2020 14:51:38 -0400, Phil Hobbs
<pcdhSpamMeSenseless@electrooptical.net> wrote:

<snip>

Check out Qubes OS, which is what I run daily. It addresses most of the
problems you note by encouraging you to run browsers in disposable VMs
and otherwise containing the pwnage.

Cheers

Phil Hobbs

That sort of thing is just another layer of kluge. It's sad that it's
necessary.



--

John Larkin Highland Technology, Inc

Science teaches us to doubt.

Claude Bernard
 
On 25/07/20 19:51, Phil Hobbs wrote:
Check out Qubes OS, which is what I run daily.  It addresses most of the
problems you note by encouraging you to run browsers in disposable VMs and
otherwise containing the pwnage.

I did.

It doesn't like Nvidia graphics cards, and that's all my
new machine has :(
 
John Larkin wrote:
On Fri, 24 Jul 2020 23:15:27 +0100, Tom Gardner
spamjunk@blueyonder.co.uk> wrote:

snip

It should be flat impossible for any application program to compromise
the OS, or any other unrelated application. Intel and Microsoft are
just criminally stupid. I don't understand why they are not liable for
damages.

That's a bit much. For one thing, there's no reason for anyone to ever
release any software that exhibits any of the top CVE pathologies,
regardless of tooling or methodology.

We didn't - those with whom I worked, that is. We designed against it, coded
against it, and tested against it. I have no evidence of even *one* actual
defect in anything I released from about 1989 onward.

I\'m excluding \"yeah, you did this, and I meant that\". Those are
perfectly understandable requirements mistakes.

What you see is the population of developers doubling every five years.

Well, I hate to be that guy, but it probably takes more than five years
to reach the journeyman phase. If I count college, it was about that for
me.

So what we do is move the goalposts and redefine "work" to mean
"knitting together frameworks into deployments".

Specifically:

Intel and Microsoft are
just criminally stupid.

They had a lot of co-conspirators, then. Perhaps you don't recall, but
the sheer change in cost created massive opportunity once PCs got
big enough to do real work. And nobody paid anyone for rigor in the work.


The value created was substantial.

We are in the dark ages of computing. Like steam engines blowing up
and poaching everybody nearby.

Electricity-as-power has a much higher historical body count.

--
Les Cargill
 
