Motion Design System at Swiggy


Nov 22, 2022


The Backstory


By 2022, Swiggy's food delivery business had scaled to more than 500 cities, and the team had grown with it. Different design and development teams were working on different areas of the app. The Consumer App struggled to maintain consistency, and UI interactions looked broken. There was no set handoff process, and every motion design request became an ad-hoc effort.


During this time, we were revamping our entire consumer app. We started to document and systematise components in both design and code. This was our chance to bring Motion into the core of the design system.





Starting With an Audit


We went through how motion was implemented across different areas of the consumer app. This helped us understand where the same pattern was used repeatedly, what should be added to the system, and which uses were just one-offs.




Fundamentally, there were no guidelines for easing and duration, so everyone was working with whatever looked good in a particular interaction. In dynamic interactions, where a lot of elements were in motion, we didn’t do a good job of defining their choreography. In many places we weren’t leveraging motion where it was needed, while in others it was probably too heavy.




Motion Design Principles


Before getting into the building blocks, we wanted to have some guidelines around when and how to use motion in the Swiggy ecosystem. Like any design system, the focus was on consistency, saving time and effort, and enhancing the UX.




Based on the action-feedback relationship, we also defined three types of interaction.

Real Time: Interactions provide immediate feedback to user input. The behaviour happens as the user performs the action.

Non Real Time: Interactions occur after user input. Users briefly pause and watch the result before continuing. This is where the motion design system comes in, with properties like duration and easing.

Non Interactive: Elements in motion are not dependent on user action, for example loaders and confetti. These can be independent of the motion design system and exist as a Lottie library.





Defining the Building Blocks


Atoms are the basic building blocks of matter. In the natural world, atoms combine to form molecules, organisms, and so on. In design systems, we use this atomic design methodology to define the base tokens like color, text, etc.


Similarly, motion’s building blocks consist of — object transformations, duration, and easing. 



Object Transformations


Transformations, or effects, are what distinguish an object from a static one. An object can undergo multiple transformations: independently, simultaneously, or in sync with other objects. Each transformation has properties whose values change from the start state to the end state.


Some of our core transformations included — Scale, Move, Fade, Rotate, Elevation and Morph.





Easing


In the physical world, objects don’t start or stop instantaneously. Instead, they take time to speed up and slow down. It’s what we call Easing. 


"Easing adds life to motion by adjusting the rate of change of an animation. It allows objects to speed up and slow down, rather than move at a constant rate."


Our approach to defining easing tokens was to map them back to the motion principles. When the user needs to focus on a task, we don’t want to keep them waiting; hence Effective Easing, which is subtle and quick. Expressive Easing, on the other hand, draws extra attention at the start and end to deliver an enthusiastic, longer, and highly visible movement.




For each easing type, we defined how a transition starts and ends using Bézier curves. A general rule of thumb is to use:


Ease-in: Objects move out of the screen with increasing speed.
Ease-out: Objects emerge on the screen at full speed and slow down.
Ease-in-out: Objects move from one part of the screen to another.
Linear: Objects change their fill or opacity.
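These rules of thumb can be expressed as a small lookup. A minimal sketch in Python, assuming the standard CSS cubic-bezier control points as placeholder values (the actual Swiggy token values aren't specified here):

```python
# Hypothetical easing tokens. The control points are the standard CSS
# cubic-bezier values, used here as stand-ins for the real token values.
EASING_TOKENS = {
    "ease-in": (0.42, 0.0, 1.0, 1.0),       # exits: accelerate off-screen
    "ease-out": (0.0, 0.0, 0.58, 1.0),      # entrances: arrive fast, slow down
    "ease-in-out": (0.42, 0.0, 0.58, 1.0),  # on-screen movement
    "linear": (0.0, 0.0, 1.0, 1.0),         # fill/opacity changes
}

def pick_easing(motion: str) -> str:
    """Map the rule-of-thumb motion categories to an easing token name."""
    rules = {
        "exit": "ease-in",         # objects move out with increasing speed
        "enter": "ease-out",       # objects emerge at full speed, then slow
        "move": "ease-in-out",     # objects move from one part to another
        "style-change": "linear",  # object changes its fill or opacity
    }
    return rules[motion]
```

For example, `pick_easing("enter")` returns `"ease-out"`, matching the rule that incoming objects emerge at full speed and decelerate.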





Duration


Duration is the time it takes for a transformation to complete. Ideally, it should be slow enough for users to notice the change, but not so slow that it keeps them waiting.


For mobile interfaces, a good rule of thumb is to keep the duration between 100 ms → 300 ms. However, duration also depends on the object’s size and the distance traveled.




Dynamic duration describes the relationship of an object’s size and distance weighted against time. To calculate the duration, we created a formula and a step-by-step approach that incorporated the distance traveled, size of the object, and easing.


Step 1


To account for distance and object size, we used the viewport area covered (or the area of the object). We then calculated the duration using this formula:


t = ((a/A) * 200) + 100ms


Here, a = area covered by the object and A = total area of the viewport. This formula gives us durations in the ideal 100 → 300 ms range.



Step 2


Next, we tweaked the value of ‘t’ based on the type of easing. For Expressive Easing, we increased ‘t’ by 10%:


t = t + 0.1 * t



Step 3


For some curves, the duration should be shorter. For an Ease-in curve:


t = 0.8 * t


For a Linear curve:


t = 0.7 * t


And for an Error curve:


t = 0.5 * t



The value of ‘t’ after all the steps is the duration for that transformation.
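The three steps can be sketched as a single helper. This is a minimal Python translation of the formula above; the function name and signature are illustrative, not taken from Swiggy's codebase:

```python
def dynamic_duration(object_area: float, viewport_area: float,
                     easing: str = "ease-out", expressive: bool = False) -> float:
    """Compute a transformation duration in milliseconds."""
    # Step 1: base duration from the fraction of the viewport covered,
    # giving values in the 100 -> 300 ms range.
    t = (object_area / viewport_area) * 200 + 100
    # Step 2: Expressive Easing gets 10% more time.
    if expressive:
        t += 0.1 * t
    # Step 3: some curves should be shorter.
    multipliers = {"ease-in": 0.8, "linear": 0.7, "error": 0.5}
    t *= multipliers.get(easing, 1.0)
    return t
```

For instance, an object covering a quarter of the viewport gets a 150 ms base duration; with an Ease-in curve that becomes 120 ms.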




Communication and Documentation


For our interface components, we usually document the configs and specs before handing off to developers. Once the building blocks were defined, we started adding how a component should behave during an interaction.




Not all objects were part of design system components yet. To show how an interaction would behave, we started writing specs using motion tokens. These were communicated asynchronously over Slack or Figma and discussed in handoff meetings if needed.




This helped us streamline the process to some extent, but it was still limited to individual components or elements. For multiple objects in motion simultaneously, we needed a different approach to systematise, document, and communicate the interaction patterns.




Choreography


Choreography is about defining the logical order or relationship in which the different parts of an animation occur.


For example, in a simple FAB interaction, there are multiple elements involved — container, icon, menu content, and dark background — which have transforming properties like opacity, size, position, etc.




To effectively communicate a choreographed interaction, we used Sequencing Graphs: a visual way of showing the elements involved, their transforming properties, the duration and easing of each transformation, offsets, delays, etc.



Incoming, Outgoing, Persistent Elements


In an interaction, there might be objects going out of the viewport, objects coming in, and objects which start and end on the viewport. As a general rule of thumb, we defined choreography for:


Outgoing Objects: Move out first and have the shortest durations.
Incoming Objects: Move in later and have slightly longer durations.
Persistent Objects: Change for most of the interaction and have the longest durations.




In the above FAB interaction, the icon (outgoing element) fades out first; the menu content (incoming element) fades in later; and the container (persistent element) transforms for most of the duration.
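A sequencing graph can also be captured as structured data. Here is a sketch of the FAB interaction as a list of tracks; the field names and timing values are illustrative, not Swiggy's actual spec format:

```python
from dataclasses import dataclass

@dataclass
class Track:
    """One element's transformation within a choreographed interaction."""
    element: str
    role: str           # "outgoing", "incoming", or "persistent"
    transform: str      # e.g. "fade", "scale", "morph"
    delay_ms: float     # offset from the start of the interaction
    duration_ms: float
    easing: str

# The FAB open interaction from the text, with made-up timing values.
fab_open = [
    Track("icon", "outgoing", "fade", delay_ms=0, duration_ms=100, easing="ease-in"),
    Track("menu-content", "incoming", "fade", delay_ms=80, duration_ms=180, easing="ease-out"),
    Track("container", "persistent", "morph", delay_ms=0, duration_ms=250, easing="ease-in-out"),
]

def check_choreography(tracks: list) -> bool:
    """Verify the rule of thumb: outgoing shortest, persistent longest."""
    by_role = {t.role: t.duration_ms for t in tracks}
    return by_role["outgoing"] <= by_role["incoming"] <= by_role["persistent"]
```

A check like `check_choreography` lets a spec reviewer catch a sequencing graph that violates the durations rule before handoff.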



Parenting


In some cases, one object’s properties may be linked to another object. For example, consider this interaction on the Swiggy Money page where the card slides down on pull.




The position of persistent objects like the offer carousel, transaction list, and background image is linked to the sliding down of the card (the parent element). Incoming elements like the logo and illustrations fade in over the same duration as the card slides down.




Bringing it all together


Once we defined the building blocks and choreography framework, we could start documenting our motion design components. For each component, we created a Notion document showing the interaction, sequencing graph, and text specs.




For non-interactive animations, we created a collection of Lottie files: animated JSON files independent of the motion design system.





The Impact


Adopting motion into the design system has had a positive impact on our product development process and user experience.

  1. UI interactions are now more consistent — which means a coherent experience and greater moments of delight for users.

  2. The design-development handoff has become much faster and more efficient. Before shipping any animation or interaction, we no longer need hours of pixel-perfection work to fix every nitty-gritty detail.

  3. With a set template for defining interaction specs, designers can easily refer to and reuse the components for any new patterns.




What's Next?


Setting up a Motion Design System has been a challenging and fruitful journey for me. It helped me learn a lot: working in systems, writing documentation, getting to know the development side, mastering the craft of motion design itself, etc.


At this point, we’ll continue measuring the impact of our system, get feedback from our internal teams, add new patterns, modify existing components and keep improving. 


In the future, we also plan to define and document the usage of:

  • Micro-interactions: Targeted feedback in response to a trigger.

  • Haptics: Delivering feedback through the sense of touch.

  • Sound Design: Communicating information, expressing emotion, educating users about interactions, etc.

Say Hello!


Have an opportunity, wanna collaborate on something cool or just say hello!


milanmundra98@gmail.com