


My Colleague Jiang Yamei

Jiang Yamei is a colleague at the East China Branch of the China Region, mainly responsible for regional finance operations. We affectionately call her Yamei.
Finance operations work is fragmented and often comes with temporary tasks, so it demands a high level of initiative, accuracy, planning, and time and risk management from the person in the role. A good operator is the soul of the project team: by managing a closed loop from stock preparation through shipment, arrival, acceptance, reconciliation, invoicing, and payment collection, she lets delivery and sales colleagues spend more time with customers, free of worries. When anomalies occur, she handles them proactively and gives early risk warnings, pushing processes forward internally while also reaching out to the customer side, which keeps customers engaged and moves key commercial milestones forward. Yamei is exactly this kind of operator.
Colleagues who have worked with Yamei all share one impression: professional, efficient, clear, and beyond expectations. The regional commit meetings she organizes always finish within 30 minutes, thanks to her thorough preparation and her command of the status of every order. In the meeting, the salespeople only need to explain opportunities and strategy; she can answer every other question. Before each meeting, she verifies project progress, problems, and exceptions with sales, delivery, and even customers. Her thorough preparation saves everyone's time and raises the efficiency of the whole organization.
A few small examples from Yamei's daily work:
1. Timely synchronization of logistics information: When a customer order ships, team members can normally look up the tracking number in the system by themselves two days after the order comes in. For the orders Yamei is responsible for, she pushes the process along and shares the progress with the customer right away. For example, for an order received on Tuesday afternoon, the customer gets an email with the shipment tracking number on Wednesday morning. This improves productivity and reduces repetitive work across the team; everyone handles their own link in the chain, and customers learn their order status at the earliest moment and are more satisfied.

2. Sensible optimization of business processes: Seal approval procedures vary with the business. A business seal may need only 6-8 approval steps, while a contract seal is more complex and requires review by more than 10 approvers. These procedures exist to keep the company standardized and compliant, but they also cost approval time. Before initiating a procedure, Yamei verifies the project background and details with the project team, so that she can choose the most reasonable and efficient route while still meeting the approval requirements. This cuts waiting time for both sides, improves efficiency, helps customer satisfaction, and smooths the way for follow-up cooperation.
3. Good cop, bad cop: For customers who are hard to communicate with, the salesperson will not throw a problem at the customer directly. Instead, they align with Yamei in advance, and the two approach the customer playing good cop and bad cop. This spares the customer's feelings while still moving the work forward smoothly.
I have also discussed with Yamei how she manages to be so accurate and efficient, hoping she could summarize her approach and promote it within the team. Besides her mastery of the tools, she shared three points:
1. Finish today's work today, and whatever can be done ahead of time, do ahead of time; never procrastinate, or the backlog only grows.
2. Communicate with sales, project managers, and customers as much as possible, follow project progress in real time, and give effective reminders wherever risks may lurk.
3. Keep summarizing: whenever a problem occurs, find the root cause and fix it so that it does not recur.
This is Yamei's daily work. Careful, meticulous, and eager to learn, she lays a solid foundation for business development, quietly contributes to Hanshow's growth, truly safeguards the company's business, and is highly recognized by colleagues and customers alike.


Yamei's daily work truly embodies customer-centered continuous innovation, dedication, and a results orientation. If everyone can improve the quality and efficiency of their own work, innovate and optimize within their existing duties, and keep the spirit of teamwork toward our common goals, we will be invincible!

Smart Retail Division Chen Jie

Why Choose Transformer?

Transformer has been the star of the deep learning world, from the 2017 paper Attention is All You Need to ChatGPT at the end of the year, and has been applied across all kinds of fields. This article introduces the basics of Transformer in the most accessible way possible: it contains not a single formula, so please read at ease.
History of Transformer
The original Transformer was designed mainly for machine translation. Its architecture is relatively simple: a stack of encoders and decoders. So what is an encoder, and what is a decoder?
Here is a plain example: seeing the Chinese word for "strawberry", you may picture what a strawberry looks like and recall its sweetness; then the strange name "Shiduopili" (a Chinese transliteration of the English word) comes to mind, and finally the word "strawberry" rolls off your tongue.
This is a human translation process, and it parallels Transformer's encoder-decoder architecture. Going from the word "strawberry" to the look and taste of a strawberry is an encoding process; going from the taste of a strawberry to the sound "Shiduopili" and then to the word "strawberry" is a decoding process.
Seen this way, doesn't Transformer look simple? Readers with some deep learning background will ask: countless models before Transformer were built on the same architecture, so what sets Transformer apart? As the title Attention is All You Need suggests, attention is the heart of Transformer.
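
To make the encoder-decoder picture a little more concrete, here is a minimal sketch built on PyTorch's off-the-shelf nn.Transformer module. It is not the article's model; the vocabulary sizes, dimensions, and token ids are all invented for illustration.

```python
# Minimal encoder-decoder sketch with PyTorch's nn.Transformer.
# All sizes and token ids here are hypothetical.
import torch
import torch.nn as nn

d_model = 64                                   # size of each word vector
src_vocab, tgt_vocab = 1000, 1000              # made-up vocabulary sizes

src_embed = nn.Embedding(src_vocab, d_model)   # source tokens -> vectors
tgt_embed = nn.Embedding(tgt_vocab, d_model)   # target tokens -> vectors
model = nn.Transformer(d_model=d_model, nhead=4,
                       num_encoder_layers=2, num_decoder_layers=2,
                       batch_first=True)
to_vocab = nn.Linear(d_model, tgt_vocab)       # decoder output -> word scores

src = torch.tensor([[5, 42]])                  # e.g. the source sentence "草莓"
tgt = torch.tensor([[1, 7]])                   # the translation produced so far

out = model(src_embed(src), tgt_embed(tgt))    # encode the source, decode the target
logits = to_vocab(out)                         # scores for the next target word
print(logits.shape)                            # torch.Size([1, 2, 1000])
```
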
What is Attention?
Multi-head Self-attention is the core component inside each Transformer encoder and decoder. Like human attention, self-attention focuses on what matters and ignores what is irrelevant. In a Transformer, every pair of words in a sentence is connected, and the connection spans multiple dimensions.
Back to the strawberries. Take the sentence: "I ate some strawberries. Taste sweet. Feel happy." It makes obvious sense to a human, but a machine may get confused: "Which one is sweet? Who is happy?" As a human, I would not hesitate to tell the machine: "The strawberries are sweet! I am happy!" That is human attention at work.
A translation model without attention may only connect words by proximity. The essence of self-attention is its ability to build leapfrog connections such as the one from "I" to "happy", and it is this mechanism that gives Transformer its superior translation performance.
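
As a toy illustration of these pairwise connections, the sketch below computes scaled dot-product self-attention over made-up word vectors. The five words and every matrix are hypothetical stand-ins; in a real Transformer the projection matrices are learned.

```python
# Toy self-attention: every word attends to every other word,
# and the softmax weights measure how strongly each pair is connected.
import numpy as np

words = ["I", "ate", "strawberries", "sweet", "happy"]
X = np.random.rand(5, 4)                  # one made-up 4-d vector per word

d_k = 4
Wq = np.random.rand(4, d_k)               # query projection (learned in practice)
Wk = np.random.rand(4, d_k)               # key projection
Wv = np.random.rand(4, d_k)               # value projection

Q, K, V = X @ Wq, X @ Wk, X @ Wv
scores = Q @ K.T / np.sqrt(d_k)           # pairwise relatedness of words

weights = np.exp(scores)
weights /= weights.sum(axis=-1, keepdims=True)     # softmax over each row

output = weights @ V                      # each word becomes a weighted mix of all words
print(np.round(weights[words.index("happy")], 2))  # how "happy" attends to each word
```
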

However, self-attention also has a defect: when it builds connections between words, it makes no effective use of each word's position in the sentence, so every word exists in isolation, without position information.

Does the position matter?
To remedy this defect, Transformer introduces Position Embedding to make effective use of position information.
One more strawberry example: "I feel happy. I ate some strawberries. They taste sweet." This sentence is very similar to the previous one, yet its causal structure differs. Across languages, word position influences sentence meaning in different ways. Japanese, for example, uses particles to attach structural position information to each word, so a Japanese sentence remains unambiguous even when the word order is reversed. In Chinese, by contrast, word order carries the structure of the whole sentence, so position is vital.
Transformer therefore uses Position Embedding to give every word a positional prefix, announcing to the self-attention module: "Every word I give you is in order, so take the position into account when making connections!"
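
To make this concrete, here is a sketch of one common choice, the sinusoidal position embedding from the original paper; note that in this formulation the position vector is added to each word vector rather than literally prefixed. All sizes are arbitrary.

```python
# Sinusoidal position embedding: each position gets a distinctive vector
# that is combined with the word vector, so attention can "see" word order.
import numpy as np

def position_embedding(seq_len: int, d_model: int) -> np.ndarray:
    pos = np.arange(seq_len)[:, None]              # position index of each word
    i = np.arange(d_model)[None, :]                # dimension index
    angle = pos / np.power(10000, (2 * (i // 2)) / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angle[:, 0::2])           # even dimensions use sine
    pe[:, 1::2] = np.cos(angle[:, 1::2])           # odd dimensions use cosine
    return pe

word_vectors = np.random.rand(5, 8)                # hypothetical word embeddings
with_positions = word_vectors + position_embedding(5, 8)
```
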
Functions of Transformer
On the surface, Transformer is a translation model. In reality, with reasonable changes to the encoder and decoder plus a few additional functional components, Transformer can do almost anything.
For example, slightly modifying the training data turns it into a simple Q&A system; ChatGPT is the model that pushed this Q&A idea to the extreme. Discard the text and feed in pixel patches instead, and it becomes the Vision Transformer, which handles image tasks (classification, detection, super-resolution, and so on). Unify text and images into one form and feed them in and out together, and it can complete multimodal tasks. With new applications emerging in an endless stream, Transformer keeps changing people's lives.
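
As a sketch of the "pixels as input" idea, the snippet below cuts a hypothetical image into 16x16 patches, which is how a Vision Transformer turns an image into a sequence of word-like vectors; the image and all sizes are invented for the example.

```python
# Turning an image into "words" for a Vision Transformer:
# cut it into fixed-size patches and flatten each patch into a vector.
import numpy as np

image = np.random.rand(224, 224, 3)       # a made-up 224x224 RGB image
patch = 16

patches = image.reshape(224 // patch, patch, 224 // patch, patch, 3)
patches = patches.transpose(0, 2, 1, 3, 4).reshape(-1, patch * patch * 3)
print(patches.shape)                      # (196, 768): 196 tokens of size 768
```
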
Conclusion
Over the past five years, Transformer has blossomed in field after field, like a technological revolution within deep learning. Its success tells us that a little innovation can change an entire industry. As researchers, we should therefore focus on creating effective innovation and dare to change. Innovation does not succeed every time, though; we need persistence, turning repeated trial and error into innovation that delivers real value.
This article has explained Transformer mainly in terms of principles and applications. Interested readers can learn about Transformer's internal architecture and the finer details of its implementation through the references.


Smart Retail Division Jing Yanhao
