Languages and scripts?

Discuss all aspects of programming here.

Moderator: The Mod Squad

Languages and scripts?

Postby atang1 » Wed Aug 18, 2004 1:10 am

This is a can of worms, or Pandora's box, because of all the languages available and the many scripts for batch processing.

On the other hand, we can master the philosophy and dependencies of modules that are written and then compiled by scripts. BIOS, pnpOS and registerOS are for sequential arrangements of programs. Static (Linux standalone) and complete programs are used only in drop-ins. You can add modules via the internet and they fit in like a charm.

Modern programming is anything but complicated. It is "write once and use forever".

Fortunately, all languages are compiled into machine language (binary) to suit different platforms (CPUs). Assemblers and disassemblers rule our lives. But scripts are now so powerful that automation in programming depends heavily on which script you use. PHP, Perl, Python, anyone?
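
To make that concrete, here is a minimal sketch of the kind of housekeeping automation a script handles well, written in Python (the folder name and the 30-day cutoff are made-up examples, not anything standard):

Code:
import os
import time

TEMP_DIR = "/tmp/myapp"      # hypothetical folder to clean
MAX_AGE_DAYS = 30            # made-up retention policy

def clean_old_files(folder, max_age_days):
    """Delete files older than max_age_days; return how many were removed."""
    cutoff = time.time() - max_age_days * 24 * 60 * 60
    removed = 0
    for name in os.listdir(folder):
        path = os.path.join(folder, name)
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            os.remove(path)
            removed += 1
    return removed

if __name__ == "__main__":
    print(clean_old_files(TEMP_DIR, MAX_AGE_DAYS), "files removed")

The same dozen lines would look much the same in Perl or PHP; the choice usually comes down to which interpreter is already on the machine and which libraries you need.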

But writing an operating system such as Linux without a VESA standards committee puts the drivers back twenty years. Drivers should be written according to the rules of the VESA standard, so all video cards can use one universal driver, or two or three with added functions. Linux sorely needs a standards committee of its own to regulate version number upgrades. If you look at any changelog, the chaos is obvious. Anyone can add his own intelligence to any module and bloat it, so some code will never see daylight in a hundred years. Some standards are set by the popularity of a chipset; the CMI8730/8 sets its own standards. So it goes.

Good luck on this new thread.
atang1
 

Postby atang1 » Thu Aug 19, 2004 2:50 am

Once you are on the internet, programming software is relatively simple.

Using HTML, you can add toolbars, radio buttons, and other audio and video multimedia features pulled in from other links. So a minimal, tiny operating system can be all you need to get on the internet.

In which case, data compression software (such as Flash) is most important, along with a segmented data download accelerator. Add-ons and drop-ins become installed on the fly in DRAM. Currently 256 MB of memory is sufficient, but if you want to remaster your live CD OS, you eventually need 1 GB of fast memory to remaster a CD-R at the end of your session.
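
To show what a "segmented data download accelerator" amounts to, here is a minimal Python sketch; it assumes the server honours HTTP Range requests, and the URL, output file name and part count are only placeholders:

Code:
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "http://example.com/big-file.iso"   # placeholder download
PARTS = 4                                 # number of parallel segments

def fetch_range(start, end):
    """Fetch one byte range of the file."""
    req = urllib.request.Request(URL, headers={"Range": "bytes=%d-%d" % (start, end)})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

def segmented_download(out_name="big-file.iso"):
    # Ask the server how big the file is, then split it into PARTS ranges.
    head = urllib.request.Request(URL, method="HEAD")
    total = int(urllib.request.urlopen(head).headers["Content-Length"])
    step = total // PARTS
    ranges = [(i * step, total - 1 if i == PARTS - 1 else (i + 1) * step - 1)
              for i in range(PARTS)]
    # Download the segments in parallel, then stitch them back together in order.
    with ThreadPoolExecutor(max_workers=PARTS) as pool:
        chunks = list(pool.map(lambda r: fetch_range(*r), ranges))
    with open(out_name, "wb") as out:
        for chunk in chunks:
            out.write(chunk)

if __name__ == "__main__":
    segmented_download()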

Software programming is changing for the better. Messenger (on portal) software using .NET strategy servers can supply most of what you need on the fly. Why store software on your HDD?
atang1
 

Postby thomas_w_bowman » Thu Aug 19, 2004 3:51 am

A major problem in the PC programming area is proprietary development tools, which result in (often intended) incompatibilities across brands and versions.

In mainframe programming, a great deal of stability is due to the specification of how COBOL (or more recently COBOL II) should behave. The only real differences across platforms are in file handling (such as VSAM, IMS, DB2, terminal I/O or even HTML/web interfaces), which can be made somewhat modular by using subprograms to handle the I/O, with passed status and indicators regarding functions (Open, Read Next, Read Keyed, Write, etc.). This is also respected as OS upgrades occur (backward compatibility) - many of the programs that I work with were written in the 1960's or 1970's and have been maintained to stay current (especially those dealing with changes in government regulations, since I work with insurance and banking applications). Backward compatibility does not mean that changing needs cannot be met, just that all that's needed is to modify that portion of the code (e.g. CICS now handles HTML, but still deals with terminal I/O as it did, with some added features - code omitting the specification for the added features will default to "same as before the new features" and thus will not require change unless the new features are needed).
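
To sketch that subprogram pattern outside COBOL - in Python here, with function codes and status values that are purely illustrative rather than real CICS or VSAM names - the caller passes a function code and gets a status back, and the storage behind it can change without touching the caller:

Code:
class CustomerFile:
    """Toy I/O subprogram: one entry point, a function code, and a returned status."""

    def __init__(self, path):
        self.path = path
        self.handle = None

    def io(self, function, key=None, record=None):
        """Return (status, record). '00' = OK, '10' = end of file, '23' = not found."""
        if function == "OPEN":
            self.handle = open(self.path, "a+", encoding="ascii")
            self.handle.seek(0)
            return "00", None
        if function == "READ-NEXT":
            line = self.handle.readline()
            return ("10", None) if line == "" else ("00", line.rstrip("\n"))
        if function == "READ-KEYED":
            self.handle.seek(0)
            for line in self.handle:
                if line.startswith(key):
                    return "00", line.rstrip("\n")
            return "23", None
        if function == "WRITE":
            self.handle.write(record + "\n")
            return "00", None
        if function == "CLOSE":
            self.handle.close()
            return "00", None
        return "99", None   # unknown function code

Swap the flat file for a database behind the same io() call and the callers never know the difference - which is the backward-compatibility point above.
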
In the relatively short time that PCs have been around, migration very often means that old code must be 'converted' again and again as the development environment changes - even fairly simple constructs are made obsolete (e.g. the BASIC and BASICA that came with DOS are no longer supported - compiled BASIC will not run under Windows [past Win98] at all). This is not out of functional necessity, but out of a perceived need to force customers to keep buying "the latest" versions, at increasing cost and overhead. Some scripting such as HTML, and perhaps some versions of Java, is beginning to show some stability - which is encouraging.
Programmers should not have to learn microcode to deal directly with chip-level manipulation, which will change constantly - this should be handled by drivers and, to some extent, the OS itself. For Linux this is developing to be the case (rather efficiently, too!), but for Microsoft's Windows the inefficient, redundant and too often buggy coding is just too slow for developers to rely on - not to mention the extreme lack of backward compatibility and the incompletely documented functionality (or "features"), which also makes for unintended opportunities for some to develop viruses.
On a mainframe, it is possible to know the specifics of every service and program that runs, and to unload any that are not running (to conserve resources) - my mainframe TSO session runs in a virtual memory space of 6 MB (and where I'm working, the default is 4 MB). Windows will never be able to participate in a mainframe OS, because a component there must be not only exceedingly efficient but also documented in detail (last date changed, load module length, etc.) as to what it is and does. This is a major reason we do not hear about viruses on mainframes either. It is interesting to note that Linux is now being implemented on very large mainframe platforms, as well as on 'obsolete' PCs - and for PCs it may be able to offer backward compatibility far beyond anything Microsoft has developed so far.
Programming is developed in layers; I work largely on the application layers (although I started with Assembler, using macros that manipulated disk read/write heads for maximum speed - though the macros would need to keep up with the microcode on hard drives as it changed, and that was eventually abandoned as a means of maintaining an application).

Just some reflections on programming itself, atang1 - I have read several of your insightful posts regarding Linux and feel you would appreciate them.

Tom
Better living thru technology...
"Open the Pod Bay Doors, HAL..."
Join Folding team #: 33258
thomas_w_bowman
Black Belt 2nd Degree
 
Posts: 2884
Joined: Fri Feb 28, 2003 2:59 pm
Location: Minneapolis, MN

Postby atang1 » Thu Aug 19, 2004 6:26 am

Definitely appreciated.

What I write about hardware and software interaction is based on the past, which should be made obsolete.

Software in the past merely had to accomplish a task within the limited capability of a simple CPU and a simple computer architecture. Today, with so much automation needed in features and functions, philosophy comes first. Software needs are often turned into hardware specifications. Cache can be smaller, but input and output channels have to be increased for massively parallel operations.

Layers of sockets and plug-ins rule the day. Now we are looking at drop-ins and add-ons. Quality control has to be built into the software functions, even for the simplest automation.

There is more data compression work yet to be done.

You can see that my conclusions are quite fundamental, while your experience has tremendous detail to sort out. With all the bloated code that people are afraid to remove, we have to depend on tiny-mini programmers to strip away all the extra frills, then put the extras onto servers to download into the browser. Embedded systems taught me a lot.

There is an Embedded Systems Conference in Boston in September (14-17).
atang1
 

Postby atang1 » Fri Aug 20, 2004 1:07 am

People often ask me, what is quality control in software?

My question in reply is "Have you bought an airline ticket online lately?"

All the cheap airline tickets are sold by middleware doing quality control on customer responses. If you search for a schedule, you will get the lowest price, whether 30 days ahead or for immediate use.

If you abandon the search and later search again, the price of the ticket for the same flight goes up (AA from 230 to 300, Northwest from 240 to 420).

If you are buying but go back to change a date, the price goes from 230 to 220.

All airline tickets sold online use the same logic. Quality control is traceability: your name is known to all the online ticket agents, and the prices quoted are all the same if you search and do not purchase.
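
Sketched as a toy Python rule (the fares and markups are only the numbers quoted above, not any airline's real logic), the middleware amounts to something like:

Code:
def quote_fare(base_fare, abandoned_searches, changing_date_mid_purchase):
    """Toy pricing rule: mark up returning searchers, reward committed buyers."""
    fare = base_fare
    if abandoned_searches > 0:
        fare = base_fare + 70        # e.g. the AA case above, 230 -> 300
    if changing_date_mid_purchase:
        fare = base_fare - 10        # e.g. 230 -> 220
    return fare

print(quote_fare(230, abandoned_searches=0, changing_date_mid_purchase=False))  # 230
print(quote_fare(230, abandoned_searches=1, changing_date_mid_purchase=False))  # 300
print(quote_fare(230, abandoned_searches=0, changing_date_mid_purchase=True))   # 220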

Have fun, because you can beat them - until the middleware is changed or becomes dynamically adjusted based on many other factors.
atang1
 

Re: Languages and scripts?

Postby S33K3R » Fri Aug 20, 2004 10:10 am

atang1 wrote:But scripts are now so powerful that automation in programming depends heavily on which script you use. PHP, Perl, Python, anyone?


The answer is not easy. You will have to do some research online and choose the one that best fits your needs and that will be simple for you to use.
S33K3R
Black Belt 2nd Degree
 
Posts: 2466
Joined: Thu Jan 01, 2004 3:18 am

Postby atang1 » Fri Aug 20, 2004 1:28 pm

Better to buy all the latest books on scripting and keep them at your bedside.

Each script language is designed for certain automation. They become more powerful in certain applications than in others.

Scripts started with DOS batch processing. DOS only had a few commands, but they could be powerful enough to do a few repetitive tasks. Coupled with a calendar scheduler, a batch file could do timely defrag or HDD housekeeping.
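
The same "script plus scheduler" pattern still works today; here is a bare-bones Python version (a real setup would use cron or the Windows Task Scheduler, and the housekeeping step is only a placeholder):

Code:
import time

ONE_DAY = 24 * 60 * 60

def housekeeping():
    # placeholder for the real task: defrag, log rotation, temp-file cleanup...
    print("running housekeeping at", time.ctime())

while True:
    housekeeping()
    time.sleep(ONE_DAY)   # wake up again tomorrow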

It started that way, and each scripting system grew scripts to do more sequential operations. So they can be quite specialized.
atang1
 

Postby atang1 » Thu Aug 26, 2004 1:35 am

Languages, on the other hand, are all the same. Some languages became less popular, so they no longer keep up with the CPU instructions. If your computer language cannot compile to the latest CPU instructions, you have to work around them with longer code.

Pascal is gone. C++ is still here but shows its age. Cobol has been modernized to keep many legacy applications working. The BASIC interpreter is no longer appearing anywhere. Visual Basic is Microsoft's language, used everywhere, every day.

Java compiles into bytecode for a virtual machine rather than directly into machine language. Other languages compile into machine language for a particular CPU platform, so to cross over CPU platforms you need a virtual machine. The Java virtual machine used to be 50,000 codes, but with new CPU instructions, virtual machines may be 90,000 codes.
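
The virtual machine idea itself is small enough to sketch. The toy interpreter below (Python, with an instruction set invented purely for illustration) runs the same "bytecode" unchanged on any CPU the interpreter runs on - which is the portability trick the Java virtual machine does at a much larger scale:

Code:
PUSH, ADD, PRINT, HALT = range(4)   # invented opcodes, for illustration only

def run(bytecode):
    """Execute the toy bytecode on a simple stack machine."""
    stack, pc = [], 0
    while True:
        op = bytecode[pc]
        if op == PUSH:
            stack.append(bytecode[pc + 1])
            pc += 2
        elif op == ADD:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
            pc += 1
        elif op == PRINT:
            print(stack[-1])
            pc += 1
        elif op == HALT:
            return

program = [PUSH, 2, PUSH, 3, ADD, PRINT, HALT]
run(program)   # prints 5 on any platform that can run the interpreter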

Inside the secrets of languages and scripts lie the CPU instructions. So we will be touching on all the specifications of the hardware and the instructions involved. The joke is always IBM's: whenever they have an emulation of some CPU, they will say "except one instruction".

This happened when I asked them what the difference is between the PC370 card (two 32-bit 68040 CPUs made up for 64-bit operation, in 1981) for the IBM PC and the IBM 370 mainframe computer (vintage 1970). They told me one instruction is missing, or cannot be emulated.
atang1
 

Postby S33K3R » Thu Aug 26, 2004 6:01 am

atang1 wrote:Languages, on the other hand, are all the same. Some languages became less popular, so they no longer keep up with the CPU instructions. If your computer language cannot compile to the latest CPU instructions, you have to work around them with longer code.

Pascal is gone. C++ is still here but shows its age. Cobol has been modernized to keep many legacy applications working. The BASIC interpreter is no longer appearing anywhere. Visual Basic is Microsoft's language, used everywhere, every day.

Java compiles into bytecode for a virtual machine rather than directly into machine language. Other languages compile into machine language for a particular CPU platform, so to cross over CPU platforms you need a virtual machine. The Java virtual machine used to be 50,000 codes, but with new CPU instructions, virtual machines may be 90,000 codes.

Inside the secrets of languages and scripts lie the CPU instructions. So we will be touching on all the specifications of the hardware and the instructions involved. The joke is always IBM's: whenever they have an emulation of some CPU, they will say "except one instruction".

This happened when I asked them what the difference is between the PC370 card (two 32-bit 68040 CPUs made up for 64-bit operation, in 1981) for the IBM PC and the IBM 370 mainframe computer (vintage 1970). They told me one instruction is missing, or cannot be emulated.

Is there a question or is this the answer :?
S33K3R
Black Belt 2nd Degree
 
Posts: 2466
Joined: Thu Jan 01, 2004 3:18 am

Postby atang1 » Thu Aug 26, 2004 10:05 am

This is a statement about languages and scripts, where the CPU instructions are the foremost consideration in the selection of any language or script.

The tools must be modern enough, and complete enough, to be selected for writing programs for the latest CPU platforms. Many people prefer one or the other without knowing which instruction sets are missing. As we all know, CPUs are advancing very fast in architecture and instruction sets. How do you program the new IBM G5 CPU with 10 virtual CPUs?
atang1
 
