
Unix Philosophy Programming Principles

王林 (forward)
2024-02-20


1 Unix Philosophy

The Unix philosophy emphasizes practicality. It is distilled from long experience rather than imposed by methodologies or standards, and much of it is latent and semi-instinctive. The knowledge Unix programmers accumulate through development experience can benefit other programmers.

  • Each program should focus on doing one task well. When a new task comes along, start a new program rather than bolting features onto the old one and increasing its complexity.
  • Expect the output of every program to become the input of another, even if that next program is not yet known, and keep the output free of irrelevant information.
  • Put designed and written software into trial use as early as possible, and be willing to decisively discard low-quality code and rewrite it.
  • Use tools in preference to clumsy manual methods to lighten the burden of programming; to strive for excellence, you must make good use of tools.

2 Coding principles

The essence of the Unix philosophy is not just what its pioneers said; it is reflected even more in their practice and in the design of the Unix system itself. The philosophy can be summarized in a few key points:

  • Module principle: use simple interfaces to assemble simple components.
  • Clarity principle: clarity is better than cleverness.
  • Combination principle: design with splicing and combination in mind.
  • Separation principle: separate policy from mechanism; separate interfaces from engines.
  • Simplicity principle: keep the design simple and the complexity as low as possible.
  • Stinginess principle: don't write a huge program unless there is no other way.
  • Transparency principle: design to be visible, so the program can be reviewed and debugged.
  • Robustness principle: robustness comes from transparency and simplicity.
  • Representation principle: fold knowledge into data so the logic can be simple and robust.
  • Popular principle: avoid novelty in interface design.
  • Silence principle: if a program has nothing to say, it should keep silent.
  • Remedy principle: when an exception occurs, exit immediately and give sufficient error information.
  • Economy principle: better to spend a minute of machine time than a second of programmer time.
  • Generation principle: avoid hand-hacking; write programs to generate programs whenever you can.
  • Optimization principle: get a prototype working before polishing it; learn to walk before running.
  • Diversity principle: never believe any assertion of the so-called "only one way".
  • Extension principle: design with the future in mind; the future arrives sooner than expected.
    When you are new to software engineering, you should understand these principles deeply. Although most writing on software promotes them, many systems lack the practical tools and traditions that would help programmers apply them; such programmers are often hampered by poor tooling, poor design, overwork, and redundant code.

    2.1 Module principle: use simple interfaces to assemble simple components

    The core of programming is managing complexity. Resolving bugs takes up the majority of development time. The success of a usable system is more a result of trial and error than just talent or design skill.

    Assembly languages, compiled languages, flow charts, procedural programming, structured programming, object orientation, and assorted software-development methodologies have all been over-touted; yet program complexity still grows beyond the processing capacity of the human brain.

    The key to developing complex software successfully is to reduce overall complexity by combining simple modules through clear interfaces. Problems then stay localized to one part of the system, and that part can be improved without affecting the whole.
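The article contains no code, but the idea above can be sketched. In the minimal Python sketch below (all names are illustrative, not from any real tool), three small components connect only through plain data, so any one of them can be replaced or improved without touching the others:

```python
# Illustrative sketch: three small components, each with a one-function
# interface, composed through plain data (str, Counter, list of tuples).
from collections import Counter

def count_words(text):
    """Component 1: turn raw text into word counts."""
    return Counter(text.lower().split())

def top_n(counts, n):
    """Component 2: keep only the n most common entries."""
    return counts.most_common(n)

def report(pairs):
    """Component 3: format (word, count) pairs as lines of text."""
    return "\n".join(f"{word} {count}" for word, count in pairs)

def summarize(text, n=3):
    """Assembly: each component knows nothing about the others' internals."""
    return report(top_n(count_words(text), n))
```

The assembly function is trivial precisely because the complexity lives inside the components, behind simple interfaces.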

    2.2 The Principle of Clarity: Clarity is better than cleverness

    Write code with future maintenance complexity and cost in mind. Code should be easy to read and understand so that it can be easily modified and maintained by others or yourself if necessary.

    In the Unix tradition, this principle applies to more than code comments. Unix best practice also emphasizes considering future extensibility when choosing algorithms and implementations. Adding technical complexity and obscurity to squeeze out a little extra performance is a tempting but losing trade: complex code is not only more prone to bugs, it is also harder to read and maintain later. Elegant, clear code, by contrast, is more stable and easier for others to understand and modify. This matters all the more because the person who has to revisit the code years from now may well be you.

    Never struggle to decipher an obscure piece of code more than twice. You may get away with it the first time; but if you find you have to reinterpret it again, long enough after the first time that you cannot recall the details, then it is time to comment the code, so that the third time is relatively painless.

    2.3 Combination Principle: Consider splicing and combination when designing

    If programs cannot effectively communicate with each other, then the software will inevitably fall into a quagmire of complexity.

    In terms of input and output, Unix tradition strongly advocates the use of simple, textual, stream-oriented, device-independent formats. Under classic Unix, most programs adopt the form of simple filters as much as possible, that is, process an input text stream into a simple text stream output. Contrary to conventional wisdom, Unix programmers prefer this approach not because they hate graphical user interfaces, but because programs are extremely difficult to interface with if they don't use simple text input and output streams.

    Text streams are to Unix tools what messages are to objects in an object-oriented environment. The simplicity of the text-stream interface reinforces the encapsulation of the tools. Many sophisticated methods of inter-process communication, such as remote procedure calls, show a tendency to entangle programs too deeply in each other's internals.

    To make programs composable, keep them independent of one another. A program at one end of a text stream should, as far as possible, care nothing about the program at the other end; it should be easy to replace the program at one end with a completely different one without disturbing the other end at all. A GUI can be a good thing, but before building one, ask whether the complex interactive program can be separated from the algorithmic program that does the heavy lifting. Each part can then be built as a separate piece and combined with the rest through a simple command stream or application protocol.

    Before devising a sophisticated data-transmission format, check whether a simple textual format will do; the small extra cost of parsing a text format is repaid by being able to construct and inspect the data stream with general-purpose tools.

    When a program cannot naturally use a serialized, protocol-based interface, correct Unix design is at least to organize as many of its programming elements as possible into a set of well-defined APIs. Applications can then at least invoke those parts through linking, or different interfaces can be glued on according to the needs of different tasks.
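As a minimal sketch of the filter pattern described above (the function name is hypothetical), a Unix-style filter reads one text stream and writes another, knowing nothing about what sits at either end of the pipe:

```python
# Sketch of a classic Unix-style filter: process an input text stream
# into an output text stream, independent of what produced or consumes it.
import sys

def filter_stream(infile, outfile):
    """Pass through only non-empty lines, stripped of surrounding whitespace."""
    for line in infile:
        stripped = line.strip()
        if stripped:
            outfile.write(stripped + "\n")

# In a real pipeline this would be wired to the standard streams, e.g.
#   cat notes.txt | python filter.py | sort
# by calling filter_stream(sys.stdin, sys.stdout).
```

Because the interface is just "text in, text out", the same function works equally well on files, pipes, or in-memory buffers during testing.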

    2.4 Separation principle: Separate policy from mechanism, interface from engine

    Policy and mechanism change on different time scales, and policy changes much faster than mechanism. Binding policy to mechanism has two negative effects: first, it makes the policy rigid and hard to adapt to changing user needs; second, it means that any change in policy is likely to shake the mechanism. Stripping the two apart, by contrast, makes it possible to explore new policies without breaking mechanisms, and also makes it easier to write good tests for the mechanism.

    One way to achieve this separation is to divide the application into front-end and back-end processes that collaborate through a dedicated application protocol layered over sockets. The front end implements policy; the back end implements mechanism. Compared with implementing the whole thing in a single process, this two-ended design greatly reduces overall complexity, can be expected to reduce bugs, and thereby lowers the program's life-cycle cost.
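The same separation can be illustrated at a smaller scale than front-end/back-end processes. In this hedged Python sketch (the helper names are invented for illustration), the mechanism is a fixed, testable traversal, while the policy is a predicate passed in and free to change:

```python
# Sketch: the mechanism (walking a directory tree) is fixed and testable;
# the policy (which files to keep) is passed in and can change freely.
import os

def find_files(root, keep):
    """Mechanism: traverse the tree; `keep` is the policy, a predicate on names."""
    matches = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if keep(name):
                matches.append(os.path.join(dirpath, name))
    return matches

# Two different policies, same mechanism:
is_log = lambda name: name.endswith(".log")
is_temp = lambda name: name.startswith("tmp")
```

Swapping in a new policy never touches (or risks breaking) the traversal code.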

    2.5 Principle of Simplicity: The design should be simple and the complexity should be as low as possible

    Pressures from many aspects often make programs more complex (and thus more costly and buggy). One of the pressures comes from technical vanity. Programmers are smart and often pride themselves on their ability to play with complex things and abstract concepts, and rightfully so. But because of this, they often compete with their peers to see who can tinker with the most intricate and beautiful things. Their design capabilities greatly exceed their implementation and debugging capabilities, and the result is expensive scrap.

    "Intricately beautiful things" sounds like a contradiction. Unix programmers compete with each other to see who can be "simple and beautiful." Although this is only implicit in these rules, it is worth mentioning publicly and emphasizing.

    At least in the world of business software, excessive complexity often comes from project requirements, and those requirements are often driven by marketing buzz rather than by customer needs or by what the software can actually deliver. Many good designs are killed by a long list of marketed features that are almost never used. A vicious cycle then begins: the way to out-fancy a competitor is to become fancier yourself. Soon bloat becomes the industry standard, and everyone is using bloated software so bug-ridden that even its developers dare not take it seriously.

    The only way to avoid these pitfalls is to encourage another software culture in which simplicity is beauty: an engineering tradition that values simple solutions, always tries to break a program system down into small parts that can work together, and instinctively resists any attempt to dress a program up with too many gimmicks.

    2.6 Principle of stinginess: Don’t write huge programs unless there is no other way

    "Big" has two senses: large in size and high in complexity. The larger the program, the harder it is to maintain. And because it is hard to part with something that cost great effort to create, a huge program that is destined to fail, or that is not the best solution, becomes wasted investment. Avoid unnecessary code and logic and keep the code lean.

    2.7 Transparency Principle: The design should be visible for review and debugging

    Because debugging often takes up three-quarters or more of development time, it pays to do a little more work at the beginning to reduce the amount of debugging work later. An effective way to reduce debugging workload is to fully consider transparency and visibility when designing.

    Transparency of a software system means that you can see at a glance what the software is doing and how it is doing it. Visibility means that a program has the ability to monitor and display its internal state, so that not only does the program run well, but it can also be seen in what way it runs.

    If these requirements are fully considered during design, it will bring benefits to the entire project process. The setting of debugging options should not be done after the fact, but should be considered at the beginning of the design. The program should not only be able to demonstrate its correctness, but also be able to inform latecomers of the original developer's problem-solving thinking model.

    If a program is to demonstrate its own correctness, it should use input and output formats simple enough that it is easy to check whether valid input maps to correct output. In the interest of transparency and visibility, it should also offer simple interfaces that other programs can easily drive, especially test-monitoring tools and debugging scripts.
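As an illustrative sketch of designing visibility in from the start (the `Tokenizer` class is hypothetical, not from any real library), the object below keeps its internal state plain and inspectable and offers an optional trace hook that monitoring tools or debugging scripts can attach to:

```python
class Tokenizer:
    """Hypothetical example: internal state is plain and inspectable,
    and an optional trace hook makes each step visible when wanted."""

    def __init__(self, trace=None):
        self.pos = 0          # current position in the input text
        self.tokens = []      # tokens produced so far
        self.trace = trace    # monitoring hook; None means run silently

    def run(self, text):
        for word in text.split():
            self.tokens.append(word)
            self.pos += len(word) + 1   # word plus the separating space
            if self.trace:
                self.trace(f"pos={self.pos} token={word!r}")
        return self.tokens
```

Passing `trace=print` turns on a running account of what the program is doing; passing nothing leaves it silent, so visibility costs nothing in normal operation.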

    2.8 Robustness Principle: Robustness comes from transparency and simplicity

    Robust software not only runs well under normal circumstances but also runs well under unexpected conditions that strain the imagination.

    Most software is fragile and bug-ridden because it is too complex for anyone to reason about all of it. If you cannot correctly understand the logic of a program, you cannot be confident it is correct, and you cannot fix it when it breaks. The way to make a program robust is to make its internal logic easier to understand, and there are two main ways to do that: transparency and simplicity.

    For robustness, it is also important to design for extreme inputs. Faced with abnormal input, one very important strategy for keeping software robust is to avoid special cases in the code: bugs usually hide in the code that handles special cases and in the code that handles the interactions among different special cases.

    The transparency of software means that you can see what is going on at a glance. A program is simple if "what's going on" is not complex, that is, if all possible scenarios can be deduced without racking one's brains. The simpler and more transparent the program, the more robust it will be.

    Modularity (simple code, simple interfaces) is a way of organizing programs to achieve this simplicity.

    2.9 Representation principle: Fold knowledge into data to achieve simple and robust logic

    Data is easier to control than programming logic. In design, the complexity of code should be actively transferred to data.

    This idea did not originate with Unix, but much Unix code shows its influence. In particular, C's pointer facilities have encouraged the use of dynamically modified reference structures at every coding level above the kernel. Tasks that simple pointer manipulations on structures accomplish easily often require more complicated procedures in other languages.

    In data-driven programming, the code and the data structures it acts on are clearly separated, so that changing the program's logic means editing the data structure rather than the code. Data-driven programming is sometimes confused with object-oriented programming, another style centered on data organization, but there are at least two differences. First, in data-driven programming the data is not merely the state of some object; it actually defines the control flow of the program. Second, object orientation puts encapsulation first, whereas data-driven programming values writing as little fixed code as possible.
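A minimal sketch of data-driven control flow in Python (the command table is invented for illustration): the table, not a chain of if/else branches, defines what the program does, so changing behavior means editing data:

```python
# Sketch of data-driven control flow: the table defines the program's
# behavior; the dispatch code stays fixed. Commands are illustrative.
HANDLERS = {
    "add": lambda a, b: a + b,
    "sub": lambda a, b: a - b,
    "mul": lambda a, b: a * b,
}

def dispatch(command, a, b):
    """Adding a new command means editing the table, not this function."""
    if command not in HANDLERS:
        raise ValueError(f"unknown command: {command}")
    return HANDLERS[command](a, b)
```

Extending the program to support `"div"` would be a one-line change to `HANDLERS`, with no risk of breaking the dispatch logic.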

    2.10 Popular Principle: Avoid novelty in interface design

    It is also known as the "principle of least surprise". The easiest-to-use program is one that requires the user to learn the fewest new things and is the program that best matches the user's existing knowledge. Therefore, interface design should avoid unwarranted novelty and cleverness.

    If you program a calculator, '+' should always mean addition. When designing interfaces, model them on the functionally similar interfaces and applications your users are most likely to know already.

    Focus on the target audience, they may be end users, they may be other programmers, they may be system administrators. Least surprising means different things to these different groups of people. Focus on traditional conventions, which exist for a good reason: to ease the learning curve.

    The other side of the least-surprise principle is to avoid things that look similar but are actually slightly different. That is extremely dangerous, because apparent similarity leads people to make false assumptions. It is better for different things to be clearly different than to look almost identical.

    2.11 Principle of silence: If the program has nothing to say, keep silent

    Well-behaved programs work silently and never chatter; silence is golden. This principle arose because when Unix was born there were no video displays, and every line of redundant output seriously consumed the user's precious time. That situation no longer exists, but the fine tradition of keeping output lean continues to this day.

    Simplicity is the core style of Unix programs. Once the output of a program becomes the input of another program, it is easy to single out the required data. From a human perspective, important information should not be mixed in with lengthy information about the internal behavior of the program. If the displayed information is all important, then there is no need to look for it. A well-designed program treats the user's attention as a limited and valuable resource, requiring its use only when necessary to avoid disturbing the user with unnecessary information.
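A small sketch of the silence principle (the function name is assumed for illustration): the program does its work quietly, reports statistics only on explicit request, and keeps any chatter off the output stream so the result stays pipeline-friendly:

```python
# Sketch: silent by default; chatter only on request, and only to stderr,
# so the real output stream stays clean for the next program in the pipe.
import sys

def copy_lines(infile, outfile, verbose=False, err=sys.stderr):
    """Copy a text stream; stay silent unless explicitly asked to report."""
    count = 0
    for line in infile:
        outfile.write(line)
        count += 1
    if verbose:
        print(f"{count} lines copied", file=err)
    return count
```

Separating the data channel (`outfile`) from the diagnostic channel (`err`) is the same discipline Unix enforces with stdout and stderr.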

    2.12 Remedy principle: When an exception occurs, exit immediately and give sufficient error information

    Software should keep the same transparent logic when an error occurs as it does in normal operation. The best case, of course, is software that adapts to and copes with abnormal operation. The worst case is a remedy that fails but silently buries the risk of a crash, which then does not surface until much later.

    Therefore, the software must deal with various incorrect inputs and its own running errors as calmly as possible. If it cannot do this, let the program terminate as much as possible in a way that makes it easy to diagnose the error.

    "Receive with tolerance, send with caution." Even when input data is nonstandard, a well-designed program tries to grasp its meaning and cooperate with other programs as far as possible; then it either fails loudly or passes strict, clean, correct data to the next program in the chain.

    Tolerance should be designed in, not patched on through overly indulgent implementations that compensate for a weak standard; otherwise a moment's carelessness can end very badly.
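The "receive with tolerance, send with caution" idea can be sketched as follows (the flag vocabulary is illustrative): sloppy but unambiguous input is normalized to a strict canonical output, and anything that cannot be understood causes a loud, well-described failure rather than a silent guess:

```python
# Sketch: tolerant input, strict output, loud failure on ambiguity.
def parse_flag(raw):
    """Normalize a boolean-ish config value to a strict 'true'/'false'."""
    cleaned = raw.strip().lower()
    if cleaned in ("1", "yes", "true", "on"):
        return "true"
    if cleaned in ("0", "no", "false", "off"):
        return "false"
    # Remedy principle: no silent guessing; stop with a useful message.
    raise ValueError(f"cannot interpret {raw!r} as a boolean flag")
```

Downstream programs never see the sloppy variants, only the canonical form, and a genuinely uninterpretable value is reported immediately with enough context to diagnose it.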

    2.13 Economic principle: Better to spend a minute of machine time than a second of programmer time

    In Unix's early minicomputer era, this view was quite radical. As technology developed, development companies and most users gained access to cheap machines, so the rationale for this rule now goes without saying.

    While ensuring quality, use computer resources to complete tasks wherever possible and reduce the burden on programmers. Another way to save significant programmer time is to teach the machine to do more of the low-level programming work.

    2.14 Generation principle: Avoid hand-hacking; try to write programs that generate programs

    It is well known that humans are terrible at doing hard detail work. Any manual work in a program is a breeding ground for errors and delays, and program-generated code is almost always cheaper and more trustworthy than hand-written code.

    With a code generator, repetitive, mind-numbing high-level code that would otherwise be written by hand can be mass-produced just as machine code is. Using a code generator pays off when it raises the level of abstraction, that is, when the generator's declarative statements are simpler than the code it generates and the generated code removes the need for laborious manual work. Code generators are used extensively in Unix to automate detailed, error-prone work.
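A toy illustration of the generation principle (the field names are hypothetical): repetitive accessor code is emitted from a small table instead of being hand-written, so adding a field means editing data, not writing more boilerplate:

```python
# Sketch of a tiny code generator: boilerplate getters are produced from
# a declarative table rather than hand-written. Field names illustrative.
FIELDS = ["host", "port", "timeout"]

def generate_getters(fields):
    """Emit Python source for one boilerplate getter per field."""
    lines = []
    for f in fields:
        lines.append(f"def get_{f}(config):")
        lines.append(f"    return config['{f}']")
        lines.append("")
    return "\n".join(lines)

source = generate_getters(FIELDS)
namespace = {}
exec(source, namespace)   # compile the generated code into real functions
```

The one-line table is the declarative statement; the generated functions are the tedious detail that no human had to type (or mistype).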

    2.15 Optimization principle: Have a prototype before carving, learn to walk before running

    The most basic principle of prototyping is: "90% of the functionality realized now is better than 100% of it never realized." Prototyping well avoids investing too much time for too little profit.

    "Do not pursue petty efficiency gains; premature optimization is the root of all evil." Rushing to optimize before knowing where the bottleneck is may be the only mistake that damages a design more than randomly adding features. From deformed code to disorganized data layouts, the one-sided pursuit of speed at the expense of transparency and simplicity breeds countless bugs and consumes enormous amounts of time, and the small gains come nowhere near offsetting the cost of the subsequent troubleshooting.

    Premature local optimization can actually hinder global optimization, thereby reducing overall performance. Modifications that would bring greater benefits to the overall design are often interfered with by a premature local optimization, resulting in a product with poor performance and overly complex code.

    In the Unix world there is a very clear and long-standing tradition: prototype first, then refine. Make sure it works before you optimize it; walk first, then learn to run. A variant from a different culture puts the same idea effectively: make it run first, make it right next, and make it fast last.

    All these sayings amount to the same thing: first build an unoptimized, slow, memory-hungry but correct implementation, then tune it systematically, looking for the places where a large performance gain can be bought by sacrificing minimal local simplicity.

    2.16 Principle of Diversity: Never believe in the assertion of the so-called "only one method"

    Even the best software is often limited by the imagination of its designers. No one is smart enough to optimize everything, nor can they foresee all possible uses of software.

    Regarding software design and implementation, one good thing about Unix tradition is that it never believes in any so-called "one-size-fits-all approach". Unix pursues the widespread use of multiple languages, open scalable systems, and user customization mechanisms; it absorbs and draws on various excellent design ideas, and continuously improves its own design methods and styles.

    2.17 Expansion Principle: Design with the future in mind, the future is always faster than expected

    Leave room for expansion of data formats and code, otherwise, you will often find yourself tied up by original unwise choices, because you cannot change them while maintaining compatibility with the original.

    When designing a protocol or file format, it should be sufficiently self-descriptive to be extensible. Either include a version number, or use independent, self-describing statements to organize the format in such a way that new ones can be inserted and old ones swapped out at any time without breaking the code that reads the format. Unix experience shows that by slightly increasing the overhead of making data deployment self-describing, you can scale without destroying the whole, and a small effort can be rewarded thousands of times.

    When designing code, organize it well so that future developers can add new features without tearing down or rebuilding the entire architecture. This principle does not mean adding unused features at will; it means considering future needs while writing code so that features are easier to add later. Keep program interfaces flexible, and add comments in the code along the lines of "If expansion ... requires ...". You owe something to those who will use and maintain your code in the future, and that maintainer may be you; designing with the future in mind may save your own energy.
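A sketch of a self-describing, versioned record format (the JSON layout and field names are assumptions for illustration): readers check the version and ignore unknown keys, so newer writers can add fields without breaking older readers:

```python
# Sketch: a versioned, self-describing record. Readers reject versions
# they cannot understand and ignore extra keys added by newer writers.
import json

def write_record(name, size):
    """Emit a version-1 record; field names are illustrative."""
    return json.dumps({"version": 1, "name": name, "size": size})

def read_record(text):
    """Accept any record up to version 1; tolerate unknown extra fields."""
    data = json.loads(text)
    if data.get("version", 0) > 1:
        raise ValueError(f"record version {data['version']} too new")
    return data["name"], data["size"]
```

The version number and the ignore-unknown-keys rule together are the "slight overhead" the text describes: they cost a few bytes per record and buy the ability to extend the format later without breaking the code that reads it.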

    3 Applying Unix Philosophy

    These philosophical principles are by no means vague and general. In the Unix world, these principles come directly from practice and form specific rules.

    With the Unix philosophy we should constantly pursue excellence. Software design is a craft worthy of wisdom, creativity, and passion. Otherwise you will never go beyond simple, stale designs and implementations; you will rush to program when you should be thinking, you will complicate the problem when you should be ruthlessly simplifying, and you will end up complaining that the code is bloated and hard to debug.

    To make good use of the Unix philosophy, never act rashly; apply skill deliberately and save your energy for when it is needed, putting the good steel on the blade's edge. Take advantage of tools and automate everything you can.

    4 Attitude

    Software design and implementation is an art full of joy, a high-level game. Why should you engage in software design instead of anything else? Maybe now it is just to make money or pass the time, or maybe you once thought that software design changes the world and is worth the passion.


    Statement:
    This article is reproduced from mryunwei.com.