
Optimization tips!! Front-end rookie speeds up interface by 60%

coldplay.xixi
2020-11-11 17:28:01

This JavaScript column introduces the techniques a front-end rookie used to speed up an interface by 60%.

"Optimization

Background

I haven’t written an article for a long time, and I have been silent for more than half a year

Continuous malaise, intermittent epileptic seizures

Dragging myself through each day in confusion and anxiety,

I have to admit that I am, in fact, a good-for-nothing.

As a low-level front-end engineer

I recently dealt with an old interface that has been handed down for more than ten years

It has inherited every bit of that supremely convoluted legacy logic.

It is said that a single call can push the CPU load up by 90%.

It specializes in curing all kinds of defiance and Alzheimer's.

Let's appreciate just how long this interface takes:

""

The average call time is over 3 seconds,

leaving the page's loading spinner spinning endlessly.

After all kinds of in-depth analysis and consultations with professionals,

the final conclusion was: give up on treatment.

Lu Xun wrote in "Diary of a Madman":

"The only things that can defeat me are women and alcohol, not bugs."

Whenever I am in darkness,

this sentence always lets me see the light.

So this time I had to toughen up.

I decided to build a Node proxy layer

and use the following three methods to optimize it:

  • Load on demand -> GraphQL
  • Data cache -> Redis
  • Polling update -> schedule
  • Code address: github

Load on demand -> GraphQL

There is a problem with the Tianxiu old interface: every request returns 1,000 records, and each record in the returned array has hundreds of fields, while the front end actually uses only about 10 of them.

How do we pull an arbitrary n fields out of a hundred-plus? This is where GraphQL comes in.

GraphQL only needs three steps to load data on demand:

  • Define the data pool (root)
  • Describe the data structure of the data pool (schema)
  • Define the query rules (Query)

Define the data pool

We define a data pool for the scenario of a loser chasing goddesses, as follows:

// Data pool
// (phone numbers are stored as strings: GraphQL's Int is 32-bit and a real phone number would overflow it)
var root = {
    girls: [{
        id: 1,
        name: '女神一',
        iphone: '12345678910',
        weixin: 'xixixixi',
        height: 175,
        school: '剑桥大学',
        wheel: [{ name: '备胎1号', money: '24万元' }, { name: '备胎2号', money: '26万元' }]
    },
    {
        id: 2,
        name: '女神二',
        iphone: '12345678910',
        weixin: 'hahahahah',
        height: 168,
        school: '哈佛大学',
        wheel: [{ name: '备胎3号', money: '80万元' }, { name: '备胎4号', money: '200万元' }]
    }]
}

It holds all the information about the two goddesses, including each goddess's name, mobile phone, WeChat, height, school, spare-tire collection, and so on.

Next we will describe these data structures.

Describe the data structure in the data pool

const { buildSchema } = require('graphql');

// Describe the data structure (schema)
// iphone is a String here: GraphQL's Int is 32-bit and cannot hold a phone number
var schema = buildSchema(`
    type Wheel {
        name: String
        money: String
    }
    type Info {
        id: Int
        name: String
        iphone: String
        weixin: String
        height: Int
        school: String
        wheel: [Wheel]
    }
    type Query {
        girls: [Info]
    }
`);

The above code is the schema of the goddess information.

First we use type Query to define a query for the goddess information: girls holds a list of this information, and since it is an array, its type is [Info].

Then type Info describes every dimension of a girl's information: name (name), mobile phone (iphone), WeChat (weixin), height (height), school (school), and the spare-tire collection (wheel).

Define the query rules

Once we have the goddess's information description (the schema), we can compose whatever combination of her information we want.

For example, if I just want to get to know a goddess, I only need her name and WeChat (weixin). The query rule code is as follows:

const { graphql } = require('graphql');

// Define what to query
const query = `
    {
        girls {
            name
            weixin
        }
    }
`;

// Run the query (inside an async function)
const result = await graphql(schema, query, root)

The filtering results are as follows:
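As a rough sketch (graphql resolves to an object whose data property holds only the selected fields), the query above, run against the sample data pool, would produce something like:

{
    data: {
        girls: [
            { name: '女神一', weixin: 'xixixixi' },
            { name: '女神二', weixin: 'hahahahah' }
        ]
    }
}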

""Another example is if I want to further develop with the goddess, I need to get her spare tire information, query Find out how much money her spare wheels have, and analyze whether you can get priority in choosing a spouse. The query rule code is as follows:

const { graphql } = require('graphql');

// Define what to query
const query = `
    {
        girls {
            name
            wheel {
                money
            }
        }
    }
`;

// Run the query (inside an async function)
const result = await graphql(schema, query, root)

The filtered results are as follows:
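Roughly, against the same sample data pool, the result would now look something like:

{
    data: {
        girls: [
            { name: '女神一', wheel: [{ money: '24万元' }, { money: '26万元' }] },
            { name: '女神二', wheel: [{ money: '80万元' }, { money: '200万元' }] }
        ]
    }
}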

"" We use the example of the goddess to show how to load data on demand through graphQL.

Mapped onto our actual business scenario: each record returned by the Tianxiu interface contains 100 fields; we configure the query to fetch just the 10 fields the page needs, avoiding transmission of the other 90 unnecessary fields.

Another benefit of GraphQL is its flexibility: this interface needs 10 fields, another interface needs 5, and the nth interface needs some other x fields.

The traditional approach would be to build n interfaces to cover all of these cases; now a single interface, queried with different field selections, covers them all.
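As a rough illustration of that idea (not the article's actual proxy code), a single endpoint can accept the GraphQL query from each caller and serve every field combination. The Express usage, route path, port, and default query below are assumptions made for this sketch; schema and root are assumed to be the ones defined in the sections above:

const express = require('express');
const { graphql } = require('graphql');
// assume `schema` and `root` are the ones built earlier in this article

const app = express();

// One endpoint serves every caller: each request supplies the query
// describing exactly the fields it needs, e.g. ?query={ girls { name weixin } }
app.get('/api/girls', async (req, res) => {
    const query = req.query.query || '{ girls { name } }';
    const result = await graphql(schema, query, root);
    res.json(result);
});

app.listen(3000);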

Enlightenment

In life, we simps really lack GraphQL's load-on-demand mindset.

Scumbag men and scumbag women: to each their own.

Your true feelings are not worth mentioning in front of the ladies

We must learn to do what they like

Show your car keys when you come up, and show off your talents if you don’t have a car

"Tonight I have an ancestral chromosome I'd like to share with you."

If it works, great; if not, just move on to the next one.

Straight to the point, simple and crude.

Data cache -> Redis

The second optimization method is a Redis cache.

Internally, the Tianxiu old interface calls three other old interfaces, and it calls them serially, which is extremely slow and resource-hungry; it is showy enough to make your scalp tingle.

We use Redis to cache the Tianxiu interface's aggregated data. The next time the interface is called, the data is fetched straight from the cache, avoiding the expensive compound call. The simplified code is as follows:

const redis = require("redis");
const { promisify } = require("util");

// Connect to the Redis server
const client = redis.createClient(6379, '127.0.0.1');

// Promisify the Redis methods so we can use async/await
const getAsync = promisify(client.get).bind(client);
const setAsync = promisify(client.set).bind(client);

async function list() {
    // Read the cache first; if it is empty, fall back to the Tianxiu interface
    let result = await getAsync("缓存");
    if (!result) {
        // Call the interface
        const data = await 天秀接口();
        result = data;
        // Write the data into the cache
        await setAsync("缓存", data)
    }
    return result;
}

list();

We first read from the Redis cache via getAsync. If data is there, we return it directly and skip the interface call; if not, we call the Tianxiu interface and then setAsync the result into the cache for the next call. Because Redis stores strings, you need to JSON.stringify(data) when setting the cache; to keep things easy to follow I have left that out here, and the detailed code is on GitHub.
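A minimal sketch of that serialization step, under the same setup as the code above (these lines would live inside an async function such as list()):

// Redis stores strings, so serialize the object before writing it...
await setAsync("缓存", JSON.stringify(data));

// ...and parse the cached string back into an object after reading it
const cached = await getAsync("缓存");
const result = cached ? JSON.parse(cached) : null;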

Putting the data in a Redis cache has several benefits:

multiple interfaces can reuse it, and multiple machines can share the cache.

This is the legendary cloud spare tire.

The success rate of pursuing one goddess is 1%.

Pursue 100 goddesses at the same time, and your odds of landing one become 100%.

As Lu Xun said in "Diary of a Madman": "Simp for one and you're a simp; simp for a hundred and you're a Wolf Warrior."

Do you want to be a simp or a Wolf Warrior?

Come on, put the cache to work, put Redis to work.

Polling update -> schedule

The last optimization: polling updates with schedule.

When a goddess's spare tires have been in service too long, she swaps in a new batch on a schedule: fresh blood, new fun.

The cache is the same. It needs to be refreshed on a schedule to stay consistent with the data source. The code is as follows:

const schedule = require('node-schedule');

// Refresh the cache once an hour (at second 0, minute 0 of every hour)
schedule.scheduleJob('0 0 * * * *', async () => {
    const data = await 天秀接口();
    // Write the fresh data into the Redis cache
    await setAsync("缓存", data)
});

天秀接口不是一个强实时性接口,数据源一周可能才会变一次

所以我们根据实际情况用轮询来设置更新缓存频率

我们用node-schedule这个库来轮询更新缓存,* * 0 * * *这个的意思就是设置每个小时的第0分钟就开始执行缓存更新逻辑,将获取到的数据更新到缓存中,这样其他接口和机器在调用缓存的时候,就能获取到最新数据,这就是共享缓存和轮询更新的好处。

Back in my simp days, I took the polling mechanism to its absolute limit.

Every day I sent scheduled, polling messages to the goddesses on my whitelist.

The infinite-loop cloud-groveling three-piece set:

  • "Hey baby, have you been missing me?"
  • "Good morning, baby!"
  • "Good night baby, mwah."

Although the goddesses still wouldn't give me the time of day,

I stayed ready to serve them at any moment.

Conclusion

After the three optimizations above,

the interface request time dropped from 3s to 860ms.

""

The code above is logic simplified from the real business.

Real business scenarios are far more complex: segmented data storage, master-slave replication with read/write splitting, high-concurrency synchronization strategies, and so on.

Every one of those modules is obscure and hard to grasp,

just as every goddess is out of reach.

The loser beat every bug, yet could never win her heart.

When it hurts, all he can do is drink alone late at night.

But whenever I dream of the goddess opening a page I built,

being stunned by how impossibly smooth it feels,

and savoring that spiritual high and the sublimation of the soul,

in that moment

I feel like I've still got it.

(The end)



Statement:
This article is reproduced from juejin.im. If there is any infringement, please contact admin@php.cn to have it deleted.