Humans are born with only two fears: the fear of loud sounds and the fear of falling. All other fears are acquired, not inborn.
We grow up surrounded by fear and learn to avoid new things, always trying to preserve the status quo. The fear of failure is the reason adults hesitate to try anything new.
If you never take a risk, you will never fail. Unfortunately, you also cannot succeed without risk. To make the most of your potential, there are times when you must take one. To live the life you want, you have to face your fear and act in spite of it.
Markup is the process of adding tags (marks) to text in a document.
■Annotations that let text be interpreted correctly
Markup is not just a way of getting text data recognized as HTML. By giving flat text its correct meaning, you can convey the appropriate information to humans and machines alike. Markup is less a technique than a set of annotations that allow text to be interpreted correctly: what role each piece of text plays, how it should be read, and so on.
Many different people use the Internet. For example, if a page is marked up correctly, even a blind user can understand its contents without difficulty through software that reads the text aloud.
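For instance, correct markup tells both a browser and a screen reader what role each piece of text plays (a minimal illustrative snippet; the text content is invented):

```html
<!-- The <h1> tag marks this text as the page's main heading -->
<h1>Today's Weather</h1>
<!-- The <p> tag marks this text as an ordinary paragraph -->
<p>It will be sunny in the morning and rainy in the afternoon.</p>
```

A screen reader can announce "heading" before the first line and read the second as body text, even though both look like plain strings in the source.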
■Also affects search results
In addition, search engines such as Google and Yahoo! interpret marked-up text to determine a page's content and main themes. As a result, markup also affects the order in which pages appear when a user performs a search.
A properly marked-up HTML document can be understood easily even without special design or layout. In other words, marking up correctly means writing high-quality HTML, and it is something every blog writer should be able to do.
SEO (Search Engine Optimization) is the term for efforts to achieve a higher position in the results displayed by search engines such as Google and Yahoo!.
Although SEO is often treated as a set of tricks for ranking higher, its original meaning is to “optimize content for search engines”. The role of a search engine is to match a user’s needs (the search) with the information they are looking for (web pages).
By publishing high-quality information with correct markup that search engines can easily understand, you can expect SEO to be effective.
Attempts to generate music on a computer have a long history. The first computer-synthesized music was realized in 1950 on Australia’s CSIRAC computer. Later, the MUSIC-N family of programming languages, begun by Max Mathews in 1957, became the basis of computer sound synthesis. Computers at that time, however, could not play music in real time: a program calculated the waveform over time and wrote the result to a sound file, and the mixed sound was then recorded to tape to produce the finished piece. It was a very time-consuming process.
Only in the 1980s, with the installation of dedicated sound-processing boards in computers, did real-time sound synthesis become possible. The ISPW board, developed and sold by IRCAM in France, enabled real-time sound synthesis and processing when attached to a NeXT computer. Computers could now process sound in real time, making possible computer-based “live electronics” that generate sound and modulate the sound of acoustic instruments during a live performance. At the time, however, the board was very expensive and sold in small numbers; it remained a privilege available only to researchers at studios and institutions specializing in computer music, and to the composers who worked with them.
TRON is a real-time OS that has been developed by the TRON Project since 1984. There are several TRON specifications, including ITRON for embedded devices and BTRON for PCs and PDAs. ITRON is embedded in devices such as mobile phones, engine control units, and video cameras. As a BTRON-specification OS for PCs, an OS has been created that can handle far more characters (hundreds of thousands, compared with the few thousand defined in JIS).
UNIX is a multi-user, multitasking OS. It was first developed at AT&T Bell Laboratories as an OS for minicomputers (around 1969); since then, many operating systems have been derived from it through various improvements, and today there are many UNIX-family OSes. UNIX runs on many kinds of computers, including mainframes, workstations, and PCs; Linux and FreeBSD are often used as UNIX on PCs.
The main advantages of UNIX are its many utility programs convenient for program development and its wealth of network-related functions. The user interface was originally command input from the keyboard, but many systems have since adopted a window system. UNIX is used mainly for research and development and for servers, as well as for personal use.
Mac OS is Apple’s OS, which runs mainly on Apple’s PC hardware. There have been several versions of the Mac OS; the latest is a single-user, multitasking OS based on UNIX. Its main feature is a GUI like that of Windows, usually operated with a single-button mouse. Because Mac OS has detailed guidelines for application interfaces, operations are often similar even across different applications. Besides everyday use, Mac OS is used in fields such as music production, graphics, and publishing.
There are two main methods for executing a high-level language program: the compiler method and the interpreter method.
In the compiler method, a high-level language program is translated into machine language by a language processor called a compiler, and the resulting machine-language program is executed by the CPU.
In the interpreter method, by contrast, a language processor called an interpreter interprets and executes the high-level language program one statement at a time, without first converting the whole program into machine language.
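The two methods can be sketched in Python, whose built-in compile() and exec() functions let us mimic both approaches (an illustrative analogy running on the Python VM, not a real machine-language compiler):

```python
source = "x = 2 + 3\ny = x * 10"

# Compiler method (analogy): translate the whole program up front,
# then execute the translated result in one go.
namespace = {}
code_obj = compile(source, "<demo>", "exec")  # translation step
exec(code_obj, namespace)                     # execution step
compiled_result = namespace["y"]              # -> 50

# Interpreter method (analogy): read and execute one statement at a
# time, never producing a translated whole-program artifact.
namespace = {}
for statement in source.splitlines():
    exec(statement, namespace)                # interpret each statement
interpreted_result = namespace["y"]           # -> 50

print(compiled_result, interpreted_result)
```

Both runs produce the same result; the difference is when translation happens, which is why compiled programs generally run faster while interpreters give quicker edit-and-run cycles.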
The Internet originated in the 1969 ARPANET experiment by ARPA (now DARPA), the US Department of Defense’s Advanced Research Projects Agency. The ARPANET experiments demonstrated that packet-switching technology is effective for data communication and that data communication on a global scale can be achieved by standardizing the protocol (TCP/IP).
In the mid-1980s, the National Science Foundation (NSF) began supporting the project. In 1990 the original ARPANET was dismantled and its role was taken over by the network operated by the NSF, to which the LANs of research facilities around the world, including the United States, were connected. This network was later opened to commercial use and developed rapidly.
[Stage 1] Specification decision stage
The first step is to define what the program must do. If this is not thought through carefully, a change in one part of the code may force other parts to be rewritten, and in the end everything may have to be rewritten.
[Stage 2] Coding stage
This is the stage of writing code according to the specification, and the one that most tests a programmer’s qualities: whether a program that performs the required operations is free of redundancy, and written so that bugs are easy to find, depends greatly on the programmer’s ability.
[Stage 3] Quality confirmation and maintenance stage
This is the stage of confirming the operation of the completed program and making corrections as specifications change. It is the most difficult stage of program development: proving that the program works correctly under all conditions, without crashing or damaging the system, is a patient and time-consuming task, because every possible input must be tried.
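As a small illustration of this stage, the sketch below exhaustively checks a leap-year function (invented for this example) against a trusted reference over a bounded input domain, which is feasible precisely because the domain is small:

```python
import calendar

def is_leap_year(year: int) -> bool:
    # A year is a leap year if it is divisible by 4, except century
    # years, which must also be divisible by 400.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# Quality confirmation: for a bounded domain we can literally try
# every possible input and compare against the standard library.
failures = [y for y in range(1, 3000) if is_leap_year(y) != calendar.isleap(y)]
print(len(failures))  # 0: the program behaved correctly on every input tried
```

Real programs rarely allow exhaustive testing like this, which is why this stage instead relies on carefully chosen test cases, and on much patience.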