Might as well stay off the internet on April 1st.
It's definitely an April 1st joke, but it's also real?!
It's all ha-ha until you realize that some versions of Dartmouth BASIC actually had matrix-operation primitives (the MAT statements), and so might've been a good choice for implementing GPU-accelerated linear algebra kernels. (It was also compiled; Microsoft basically established BASIC's reputation as a slow language by shipping an interpreter-only version for the Altair and later machines. As usual, Microsoft gonna Microsoft.)
How would people feel about CUDA programming in a simplified Fortran? That's just a step away from BASIC.
Given that the go-to linear-algebra libraries of the past N decades (BLAS, LINPACK, etc.) are written in Fortran, I'd suspect that neural-network people would be fairly okay with it, especially if it could be driven from a Python wrapper (which is how most people use BLAS and LINPACK today).
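That "Python wrapper over a Fortran-lineage library" pattern is exactly what NumPy does today: on typical builds, `@` / `numpy.matmul` on contiguous float64 matrices dispatches to whatever optimized BLAS NumPy was compiled against (OpenBLAS, MKL, etc.). A minimal sketch:

```python
import numpy as np

# The @ operator on 2-D float64 arrays is typically routed to the
# BLAS gemm routine of whatever BLAS this NumPy build links against --
# the user writes Python, the flops happen in decades-old Fortran-lineage code.
a = np.array([[1.0, 2.0], [3.0, 4.0]])
b = np.array([[5.0, 6.0], [7.0, 8.0]])

c = a @ b
print(c)  # [[19. 22.] [43. 50.]]
```

(`np.show_config()` will report which BLAS/LAPACK a given NumPy install is actually linked against, if you want to check.)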
BASIC is roughly to Fortran what Rust is to C++: its creators set out to design a "better Fortran," then realized that Fortran's limitations and complexities necessitated a whole new language.