Looking for the perfect programming language for your application? Many experts have told you there isn’t one, only certain tools for certain tasks. Yet there is: your imagination. You just have to know how to share it with others.
The types paradigm
Historically, programming languages have fallen into two major categories: a smaller one of dynamic, or typeless, languages such as PHP, JavaScript, Python, or Ruby, and a larger one of typed languages ranging from C, C++, Pascal, Fortran, Java, C#, Go, and Rust to type-augmented languages like TypeScript or Dart. Yet the dynamic languages do use types for data manipulation and even perform some type checking at run time, while the typed, or static, languages usually run in a binary form in which type checking is no longer performed for the memory zones that were variables or data structures, since these were already checked at compile time. So types are always on our minds while we’re coding, but not so relevant to the machine executing the code. This is because a type is a human concept, sitting near similar ones like category, kind, sort, and class. Although static languages have used their stricter syntax to perform better error checking, ahead-of-time compilation, and optimizations, dynamic languages have caught up in features and performance using techniques like just-in-time compilation and optimized execution paths. And however limited the set of types a variable may hold in code, most of those use-case scenarios never materialize when the program actually executes.
In a more natural programming language, variable names would carry word meaning, so that the bundler/compiler could look them up in a human-language dictionary to perform additional type detection. We humans take advantage of other sensory information, knowledge, and context to infer the types of objects even better. A smart compiler, parser, or IDE can use interaction/prompting or AI assistance to complete the type identification at development time.
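As a thought experiment, the dictionary lookup described above might look like the following minimal sketch. The dictionary entries and the `inferFromName` helper are hypothetical, not part of any existing tool:

```javascript
// hypothetical sketch: infer a type from the word meaning of a variable name
const dictionary = new Map([
  ['count', 'number'], ['age', 'number'],
  ['name', 'string'], ['title', 'string'],
  ['done', 'boolean'], ['visible', 'boolean'],
])

function inferFromName(identifier) {
  // split camelCase into words and look the last word up in the dictionary
  const words = identifier.split(/(?=[A-Z])/).map((w) => w.toLowerCase())
  return dictionary.get(words[words.length - 1]) ?? 'any'
}

console.log(inferFromName('userName'))   // 'string'
console.log(inferFromName('retryCount')) // 'number'
```

A real implementation would of course need a far larger dictionary, plus contextual disambiguation for words that map to several types.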
The long-standing debate over the syntactical expression of types throughout the code may become, after all, simply a matter of to type or not to type.
A good idea revived
As described on the TypeScript Lite project page, this Node.js package (TSLite) converts a JavaScript source tree into a TypeScript source tree. This is possible because TSLite transparently uses one of the simplest forms of type annotation: the prefix notation.
The idea of prefix naming, also called the Hungarian notation, was first brought up several decades ago by large software companies for some of their in-house platforms, such as Microsoft’s MFC and Win32 API or Borland’s OWL. Because the notation was used at the platform level, and the companies insisted on a canonical construction of the required prefixes, developers soon found themselves referencing a large number of prefix items made mainly of consonants. They joked that they were in fact learning the spelling of a foreign language, and the naming practice faded away.
In contrast to the original implementation, which still has the merit of a good beginning, the TSLite prefix notation has the following key features:
- Prefixes are user-defined at project level via a configuration file. It’s also possible to specify prefixes at file level via a dedicated block comment.
- Users can choose lowercase prefixes of any length, or even define them by regex pattern matching. These prefixes apply to built-in or user-defined variable types, but also to specific chosen variables.
- In CommonJS and ES modules, imported and exported variables and top-level functions can and should be exempted from the prefix naming; TypeScript will infer their types from the imported modules and from function signatures. No prefix notation is applied to object/class/map members. For these structures, one can use external TS definition files or structural JSDoc comments.
An example of a conventional Express.js app stub that gets converted to TypeScript based on variable prefixes and a config file is shown below.
app.js
/* tslite-add
 * interface Request { app: any; }
 * interface Response { locals: any; render: Function; }
 */
const cookieParser = require('cookie-parser')
const express = require('express')
const httpErrors = require('http-errors')
const logger = require('morgan')
const path = require('path')
const indexRouter = require('./routes/index')

/// @ts-ignore
const app = express()

// view engine setup
app.set('views', path.join(__dirname, 'views'))
app.set('view engine', 'ejs')

app.use(logger('dev'))
app.use(express.json())
app.use(express.urlencoded({ extended: false }))
app.use(cookieParser())
app.use(express.static(path.join(__dirname, 'public')))

app.use('/', indexRouter)

// catch 404 and forward to error handler
app.use((req, res, next) => {
  next(httpErrors(404))
})

// error handler
app.use((err, req, res, next) => {
  // set locals, only providing error in development
  res.locals.message = err.message
  res.locals.error = req.app.get('env') === 'development' ? err : {}

  // render the error page
  /// @ts-ignore
  res.status(err.status || 500)
  res.render('error')
})

module.exports = app
app.ts
interface Request { app: any; }
interface Response { locals: any; render: Function; }

const cookieParser = require('cookie-parser')
const express = require('express')
const httpErrors = require('http-errors')
const logger = require('morgan')
const path = require('path')
const indexRouter = require('./routes/index')

/// @ts-ignore
const app = express()

// view engine setup
app.set('views', path.join(__dirname, 'views'))
app.set('view engine', 'ejs')

app.use(logger('dev'))
app.use(express.json())
app.use(express.urlencoded({ extended: false }))
app.use(cookieParser())
app.use(express.static(path.join(__dirname, 'public')))

app.use('/', indexRouter)

// catch 404 and forward to error handler
app.use((req: Request, res: Response, next: Function) => {
  next(httpErrors(404))
})

// error handler
app.use((err: any, req: Request, res: Response, next: Function) => {
  // set locals, only providing error in development
  res.locals.message = err.message
  res.locals.error = req.app.get('env') === 'development' ? err : {}

  // render the error page
  /// @ts-ignore
  res.status(err.status || 500)
  res.render('error')
})

module.exports = app
tslite.json
{
  "input": ["express-app", "sample.js"],
  "output": "../src-ts",
  "matcher": ["/\\.js(x?)$/i", ".ts$1"],
  "prefixes": {
    "a": "any[]",
    "b": "boolean",
    "f": "Function",
    "n": "number",
    "o": "any",
    "s": "string",
    "$e[0-9]*": "any",
    "$[i-nx-z][0-9]*": "number",
    "req": "Request",
    "res": "Response",
    "$next": "Function",
    "$err": "any"
  }
}
tsconfig.json
{
  "compilerOptions": {
    "outDir": "dist",
    "strictNullChecks": false,
    "target": "es2016",
    "module": "commonjs",
    "esModuleInterop": true,
    "forceConsistentCasingInFileNames": true,
    "strict": true,
    "skipLibCheck": true
  }
}
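To make the prefixes table concrete, here is a minimal sketch of how a prefix-to-type resolver might work. This is a hypothetical illustration, not TSLite’s actual implementation; the `resolveType` helper and the matching rules are assumptions inferred from the config above:

```javascript
// hypothetical sketch: resolve a variable name to a type using a
// prefixes table shaped like the one in tslite.json
const prefixes = {
  "b": "boolean",
  "n": "number",
  "s": "string",
  "$[i-nx-z][0-9]*": "number",
  "req": "Request",
  "$next": "Function"
}

function resolveType(name) {
  for (const [key, type] of Object.entries(prefixes)) {
    if (key.startsWith('$')) {
      // keys beginning with '$' are regex patterns matched against the whole name
      if (new RegExp('^' + key.slice(1) + '$').test(name)) return type
    } else if (name === key || (name.startsWith(key) && /[A-Z]/.test(name.charAt(key.length)))) {
      // plain keys match the whole name (e.g. req), or act as a lowercase
      // prefix followed by an uppercase letter (e.g. nTotal)
      return type
    }
  }
  return null // no rule applies; the variable stays untyped
}

console.log(resolveType('nTotal')) // 'number'
console.log(resolveType('req'))    // 'Request'
console.log(resolveType('i0'))     // 'number'
```

The “whole name or prefix plus uppercase letter” rule is one plausible design choice; it prevents a plain prefix like `n` from accidentally matching a word such as `next`.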
Since the JS code was generated by an automated tool (yeoman), it is kept unaltered here, but you can easily apply the prefix notation to all desired variables and immediately see the additional TS typings that result.
Although the TSLite package is a Node.js console app, it can be run uniformly on mobile, browser, server, or any other ES5/ES6 project code. A typical workflow is: first write your ES6 prefixed code and verify it with ESLint; then convert the code with TSLite and verify it with TSC.
As this toolset allows you to write robust JavaScript code that gets converted in one step into any TS annotation dialect (e.g. AssemblyScript), you can use the prefix notation to map automatically to a typed/static source code.
Stenography
People have used shorthand notation, also called stenography, since classical Greek antiquity (ca. 400 BCE) to record spoken messages in writing at a speed comparable to that of the speaker. Starting from the Greek and Roman era, stenography evolved greatly in the 18th and 19th centuries, first with forms of shorthand based on geometrical shapes (circles, lines, dots), and then with symbols for phonemes (spoken sounds). The latter became standardized and, within a few decades, was adopted in over 16 languages across Europe, America, and Asia. At the same time, stenography, orthography (of punctuation marks and abbreviations), and the symbolic notation of the sciences borrowed heavily from one another while maturing into standard methodologies. What started as a need for speed in writing became a universal approach to conveying meaningful, concise, and precise information for people all over the world.
A century later, the creators of the third-generation programming languages drew on the principles of shorthand and symbolic notation, building the first human-like syntax for what was still perceived at the time as a wrapper around computational logic and machine instruction sets. Due to their universality, some of these languages have become the backbone of modern programming, adopted in every corner of the world, where they popularized a common technological advancement and the slang that we call today “IT English”.
Coding tools are supposed mainly to solve technical problems, yet they are also, everywhere, a powerful means of collaboration and communication, just as shorthand and symbolic notations were a century before the digital era.
The Script
Taking into account the previous arguments, one can say that the general purpose programming language of the near future will have the following key features:
- It comes from a human-centric approach based on math and physics, as opposed to one based on the machine instruction set. Current algorithmics, tools, and AI are able to translate from the human expression syntax to the computer logic. It is expected that future programming of most systems will concern solely the business problem, while allowing a full range of human inputs besides writing.
- It uses short words, abbreviations (e.g. of simplified International English), and symbols for keywords and operators, keeping an easy-to-understand vocabulary while adhering to the shorthand-notation principle of writing concisely and precisely. The code is written in a kind of universal language, but it reads aloud in the natural language of the reader, much like pseudo-code is read today.
- It targets both interpreted and compiled runtimes. The original human-inputted code is considered the source of truth. For any target, an augmented code may be generated automatically or with user assistance. The augmented code is linked to segments, or logical units (on the order of several lines), of the original code. Modifying a segment of the original code invalidates the corresponding fragment of the augmented code, which will need to be regenerated depending on the target requirements.
- It has a framework for controlling data structures and program flow, called the program constraints. This program constraints framework describes and configures the following: data types, objects’ shape (structure), field validation, error hierarchy, module priority (sequence), concurrency configuration, etc. It is secondary and applies immediately after the original code, thus affecting any augmented code. It can be relied on at runtime, effectively taking part in the running program.
The software that enables programming with this language has a modern set of development tools, including linting, compiling, code auditing and test generation, packaging, and runtime emulation (sandboxing). Every stage that outputs some form of code or a program is traceable, and a human can intervene if necessary.
The language, which we’ll simply call “The Script”, makes it easy to use higher-order data structures and algorithms (as in the C++ STL), asynchronous and web service wrappers, and various tools for data processing and AI, by including their code in its standard library.
Many of today’s general-purpose programming languages have acquired built-in optimizations for ahead-of-time or just-in-time compilation, sleek and freely available libraries for common domains of interest, and good ergonomics for developers aiming at complex problem-solving. Learning their basics has become trivial thanks to AI code generation, and their interchangeability is evident thanks to the excellent capabilities of today’s code transpilers. Instead of learning ten dialects, or ten ways of using an advanced instrument such as our software solution platform, one can concentrate on mastering the general-purpose language that targets all common runtimes, compiled or interpreted, and offers flexible optimization settings, specialized code libraries for business and technology, and a universal environment ready to implement any creative endeavor.
Full circle
Imagine that your computer keyboard also has two rows of keys bearing regular shapes and known symbols like those we currently use. Any of these twenty symbol keys starts and completes a sequence of at most three keystrokes that finally becomes one geometrical symbol. The first keystroke draws the typed shape as the lower part at the typing position; the second draws the typed shape or symbol as the upper part, perhaps with a slight temporary overlap; and the third draws as superscript or subscript (chosen by a modifier key), all within the same geometrical symbol, which takes the place of one character. From the first keystroke, a popup list appears near the typing position, containing final geometrical symbols sorted by similarity to the typing sequence in progress. This is a single-selection list where each item is a geometrical symbol followed by a word in the natural or preferred language of the user, but with the same global meaning when translated. The selection within the list moves or refines as you type the sequence, or by user navigation (arrow keys, mouse, tapping, dictation, etc.). The sequence is terminated either by the third keystroke, by pressing a terminator key (e.g. Enter) or a character/delimiter key other than those twenty, or by choosing an item from the list.
The final symbol isn’t a direct juxtaposition of the typed shapes, but is rather hinted at by their succession and composition, looking and acting in fact as a TrueType font glyph. Examples of such final symbols might be: a square followed by a sitting triangle gives the ‘house’ symbol; a square followed by an upside-down triangle gives the ‘tree’ symbol; an underline followed by a circle gives the ‘sunrise’ symbol; and an underline followed by a circle and then an asterisk gives the ‘sunset’ symbol.
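At its simplest, the shape-to-symbol composition could be modeled as a lookup table over keystroke sequences. The following sketch is entirely hypothetical, merely mirroring the four examples above:

```javascript
// hypothetical sketch: final symbols hinted by the succession of
// up to three typed shapes (mirroring the examples in the text)
const compositions = new Map([
  ['square,triangle', 'house'],
  ['square,inverted-triangle', 'tree'],
  ['underline,circle', 'sunrise'],
  ['underline,circle,asterisk', 'sunset'],
])

function compose(keystrokes) {
  // a sequence holds at most three shape keystrokes
  if (keystrokes.length > 3) throw new Error('sequence too long')
  return compositions.get(keystrokes.join(',')) ?? null
}

console.log(compose(['underline', 'circle', 'asterisk'])) // 'sunset'
console.log(compose(['circle', 'square'])) // null (no such composition)
```

A real input method would add the similarity-sorted popup list and partial-sequence matching on top of such a table.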
The inputted symbolic script will also carry a word transcript in a default spoken language (e.g. English) that translates word-for-word to the words chosen by the typing user.
When used in coding, this typing method allows geometrical symbols everywhere we use words for naming: variables, functions, classes, member names, etc. This can work with actual code while keeping the text format for exchanged data like JSON or XML.
Remember that the Latin script descends, in order, from the Greek, Phoenician, and Aramaic scripts, and ultimately from the Egyptian hieroglyphs, and that abstract sketching predates writing as a concept of the human mind. Also, various sci-fi depictions show a group of geometric shapes or symbols as a control sequence for activating some distant-future technology.
It is said that the most perfect shape in the universe is the circle, hence its constant use in the arts, in beauty, and even in the sciences. The planets in every star’s gravitational field travel through space with the least energy loss on closed and almost circular paths. Even the tiny electron goes round the atom’s nucleus inside a closed strip, called an energy level.
It seems that the circular path is one of the fundamental laws of the universe. Some people say that human history itself tends to repeat in cycles every couple of hundred or thousand years.
But some of us have also noticed that planets, when closing their paths, do not return to the same point of the universe, because the stars and entire galaxies have traveling paths of their own. The tiny electron has a tendency to jump from one energy level to another, and even to leave the atom. If you draw a circle in the sand, it won’t close at the same point in space, because the planet has been revolving in the meantime. The circles of living beings evolve into spirals.
So we’ve learned that there is a second fundamental law of the universe, at least as important as the first: evolution.
For those of us who value human knowledge, there is no reason to be afraid of closed paths, or that history might repeat itself.