Get Mood Wise Tracks of Spotify in React within 8 hours using DhiWise

Music streaming has evolved dramatically with the digital revolution over the last few years. What seems like a lifetime ago, people used MP3s, Napster, iPods, and even illegal methods to download and listen to music. Before that, there were CDs, cassettes, and vinyl records. Although having a record player at home and lining up outside the store to get the latest releases sounds delightful, people today can't picture not having music at their fingertips - available whenever and wherever they want it!

The major focus is on the music streaming platform Spotify. The evolution of music streaming in itself is a topic that one could talk about for hours, but for this use case, we will be looking at the rise of the internet-based music streaming service.

Spotify provides software and app developers access to some of its data about users, playlists, tracks, and artists through a Web API.

Spotify for Developers provides detailed and user-friendly documentation for their Web API.


You might be wondering: why DhiWise?

DhiWise is a programming platform where developers can convert their designs into developer-friendly code for mobile and web apps. DhiWise automates the application development lifecycle and instantly generates readable, modular, and reusable code.

This application saved time and money by utilising DhiWise features such as API integration, navigation, and constants.

The Challenge

A production-ready application’s must-haves are:

  • Attractive UI and UX
  • Code quality
  • Accurate live information
  • Seamless navigation
  • Exception handling to avoid app crashes

This application's features include the user's top tracks over the last one to six months, as well as the ability to select any emoji and get a song from Spotify based on that emoji.

The Solution

🚀 Mission Launch

For the React UI, a custom Figma design was created following DhiWise guidelines.

Application Tech Stack


  • React (UI from a custom Figma design)
  • DhiWise (code generation and API integration)
  • Spotify Web APIs
  • Tailwind CSS
  • Vercel (deployment)

Lifecycle of a Mission (Implementation)

🔍 Discovery

Emojis are used to convey emotional reactions, which comes in handy when choosing songs/tracks.

44% of Spotify's monthly active users use the platform daily. That equates to more than 104 million daily active users. People usually listen to music while working to be more productive, and being able to fetch tracks with emojis is the icing on the cake.

Mood Wise tracks also shows users' top 5 songs from the last one to six months, allowing them to replay their most frequently played tunes.
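A minimal sketch of building such a request (the helper name is mine; the time ranges are the Spotify Web API's documented short_term / medium_term / long_term values, covering roughly four weeks to several years):

```javascript
// Sketch: build the Spotify "user's top tracks" request URL.
// time_range is one of: short_term (~4 weeks), medium_term (~6 months),
// long_term (several years), per the Spotify Web API docs.
function topTracksUrl(timeRange, limit) {
  const params = new URLSearchParams({
    time_range: timeRange,
    limit: String(limit),
  });
  return `https://api.spotify.com/v1/me/top/tracks?${params}`;
}

// e.g. the top 5 tracks over roughly the last six months:
const topFiveUrl = topTracksUrl('medium_term', 5);
```

The resulting URL is then fetched with the user's access token attached.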

It allows you to select an emoji and returns a unique song based on your selection using the Spotify Recommendations API, filtered to tracks with a popularity score above 10 (and only tracks with a tempo greater than 10); subsequent songs start from the chorus to keep the music flowing.

💭 Planning and Design

The Figma design was created according to DhiWise guidelines, with maximum validation.

Spotify APIs are open APIs, free to use with a registered developer account.

To access Spotify APIs, a new app was created in the My Applications section of the Spotify Developer dashboard. The client_id and client_secret were copied into environment variables. For token creation, the Authorization Code Flow was used to obtain the refresh token.

In Postman, create a new request and, in the Authorization tab, select OAuth 2.0 to obtain an access token and refresh token.

The token was generated by Base64-encoding the client ID and client secret joined by a colon. An access token is valid for one hour and then needs to be refreshed.
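The refresh step described above can be sketched as follows, assuming the client_id, client_secret, and refresh token come from the environment variables mentioned earlier (the helper name is mine; the endpoint, Basic auth scheme, and form parameters are Spotify's documented Authorization Code Flow):

```javascript
// Sketch of the Authorization Code Flow refresh step.
function buildRefreshRequest(clientId, clientSecret, refreshToken) {
  // Spotify expects Basic auth: base64("client_id:client_secret")
  const basic = Buffer.from(`${clientId}:${clientSecret}`).toString('base64');
  return {
    url: 'https://accounts.spotify.com/api/token',
    method: 'POST',
    headers: {
      Authorization: `Basic ${basic}`,
      'Content-Type': 'application/x-www-form-urlencoded',
    },
    body: new URLSearchParams({
      grant_type: 'refresh_token',
      refresh_token: refreshToken,
    }).toString(),
  };
}
```

The response contains a fresh access token, which the app stores until it expires an hour later.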

The API integration in DhiWise uses the Spotify Recommendations API and the User's Top Tracks API. To find music by emotion, several filters must be applied, such as limit, seed_genres, market, and min_popularity.
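A sketch of how those filters combine into a recommendations request; the emoji-to-genre mapping below is an illustrative assumption, not the app's actual values, while the endpoint and query parameters (seed_genres, min_popularity, min_tempo) are real Spotify Web API parameters:

```javascript
// Illustrative emoji-to-seed-genre mapping (assumed values).
const MOOD_SEEDS = {
  '😀': 'happy,pop',
  '😢': 'sad,acoustic',
  '😡': 'metal,rock',
};

// Build the Recommendations API URL for one selected emoji.
function recommendationsUrl(emoji) {
  const params = new URLSearchParams({
    limit: '1',
    market: 'US',
    seed_genres: MOOD_SEEDS[emoji],
    min_popularity: '10', // only reasonably popular tracks
    min_tempo: '10',      // skip near-zero-tempo tracks
  });
  return `https://api.spotify.com/v1/recommendations?${params}`;
}
```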

🛠️ Development

  • After Figma was synced successfully, the UI quality was tested by building the app before API integration; the Vercel deployment feature was used for this.

  • Once it was ensured that the components had been detected correctly, the row in the above-mentioned Figma design was changed to a list so that the API response could be data-bound.

  • The APIs were added one after another using DhiWise's API integration feature and tested with the API Runner feature, which enables API execution. After the APIs were entered, the list was displayed in the panel.

  • The most important step was integrating the API into the screen. From the screen list, clicking Configure opens a page for detecting components. Here, the token API was attached to the screen's lifecycle actions, so that the API is triggered when the page loads.

  • The Top Tracks API was connected, and binding the response to the view was then a breeze.

  • Array values for artist names were manually set on the top-tracks list, which took five minutes of coding.

  • Navigation on clicking a song's photo was set using the navigation option, with a static URL passed for the time being. Later, the URL would be supplied dynamically through manual coding.

  • To customise the music based on mood, triggering the API integration on emoji clicks was the only way.

  • Then, by right-clicking an emoji, the API integration was configured, binding the response to the view and handling both error and success cases.

  • In the emoji sidebar, the data needed to be fetched dynamically, so a constants file was generated containing the JSON of moods and image links.
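The artist-name binding mentioned above can be sketched like this, assuming the track shape the Top Tracks API returns (an artists array of objects with a name field):

```javascript
// Sketch: flatten a Spotify track's artists array into a display
// string for the top-tracks list.
function artistNames(track) {
  return track.artists.map((a) => a.name).join(', ');
}

// Example with the shape the Top Tracks API returns:
const track = { name: 'Song', artists: [{ name: 'A' }, { name: 'B' }] };
// artistNames(track) → "A, B"
```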

The data constant was then set manually, and the array of objects was used to map the data dynamically.

The array was later mapped in the sidebar code.
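A minimal sketch of that constants file and the mapping step; the mood names and image paths are placeholders, not the app's actual values:

```javascript
// Sketch of the mood constants file (placeholder values).
const MOODS = [
  { mood: 'happy', image: '/images/happy.png' },
  { mood: 'sad', image: '/images/sad.png' },
  { mood: 'angry', image: '/images/angry.png' },
];

// Map the constant array to sidebar item props; the actual React
// sidebar would render each item as an image plus label.
function sidebarItems(moods) {
  return moods.map(({ mood, image }) => ({ key: mood, label: mood, src: image }));
}
```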

Finally, the app was built and the code was synced to the GitHub repository.

After using DhiWise, four hours of manual coding were still required.

First, the API headers were set: a "Bearer " prefix was added to the access token, as the API requires it in the Authorization header.
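A sketch of that header setup (the helper name is mine; the Authorization: Bearer scheme is what the Spotify Web API requires on every request):

```javascript
// Sketch: attach the access token with the "Bearer " prefix the
// Spotify Web API requires.
function authHeaders(accessToken) {
  return {
    Authorization: `Bearer ${accessToken}`,
    'Content-Type': 'application/json',
  };
}
```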

The next stage was to dynamically set the emoji sidebar.

Thereafter, the main focus was serving tracks by mood through that API: setting the response from the array and implementing custom logic for it.

Fixing the inaccurate CSS for that update was achievable with a little understanding of Tailwind CSS.

🚛 Delivery

The app helps you listen to tracks that match your mood, and it is now open for the world to use.

It is deployed on Vercel and you can access the preview here: https://mood-wise-tracks.vercel.app/