Judelyn Gomes
#Business | 6 Min Read
Data warehouse (DW) modernization has become vital for modern businesses. It ensures timely access to the data and analytics businesses need to operate smoothly. To facilitate smart decision making, practitioners, especially in the manufacturing industry, use data warehousing for OLAP (Online Analytical Processing) applications to gain a distinctive edge.
Traditional DW systems are often unable to cope with the requirements of contemporary industries, resulting in several pain points. DW modernization is therefore needed to keep pace with ever-evolving business environments, rapidly iterate cutting-edge solutions, and provide adequate support for new data sources. The manufacturing industry in particular involves a host of complex processes and huge investments, and modernizing the data warehousing system is necessary to ensure the best possible outcomes.
A modern DW featuring cloud-built data architecture helps companies support their current and future data analytics workloads, irrespective of scale. The high flexibility offered by popular cloud platforms like AWS empowers businesses to work with their data in real time and comes as a boon for many modern sectors, including manufacturing.
Data warehousing and the manufacturing industry
Over the years, several companies belonging to the manufacturing industry have started to use DW to improve their operations and deliver enterprise-level quality. Data and analytics allow manufacturing firms to stay competitive and cater to the current market needs. Reports, dashboards, and analytics tools are used by manufacturing professionals to extract insights from data, monitor operations and performance, and facilitate business decisions. These reports and tools are powered by DWs that efficiently store data to reduce the input and output (I/O) of data and deliver query results in a swift manner.
While transactional systems are used to gather and store detailed data in manufacturing firms, DWs cater to their analytics and decision-making requirements. Modern manufacturing companies ideally have systems in place to control individual machines and automate production. Online Transaction Processing (OLTP) is optimized to facilitate the swift data collection and feedback required for direct machine control.
For real-time feedback to be meaningful, it is imperative to have historical context. An advanced data warehouse would be required to provide this context. Real-time systems can use this historical data from the DW, along with current process measurements, to offer feedback that facilitates real-time decision support.
Pain points of legacy data warehouses
Data warehouse modernization is required to address a host of organizational pain points. Business agility is among the prime goals of modern organizations as they digitize their operations, and it is extremely hard to achieve with legacy tools. The key aim of modernizing a firm's data warehouse environment is to improve its analytics functions, productivity, speed, scale, and economics.
Below are some of the challenges with legacy data warehouses:

  • Advanced analytics: Many organizations today are investing in online analytical processing (OLAP). However, with legacy data warehousing environments, they are often unable to effectively use advanced forms of analytics, find new customer segments, efficiently leverage big data, and stay competitive.
  • Management: Legacy infrastructure is quite complex, so companies often have to keep hiring professionals to manage these outdated systems, even though they do not advance agility or data strategy. This incurs unnecessary expenses that a modernized system can cut down.
  • Support: There are many newer data sources that are not supported by the typical legacy data warehouses. These traditional solutions were not designed to handle the varying types of structured and unstructured data prevalent today, and hence can pose problems for many businesses. These issues can be solved by moving to more modernized technologies.
How can data warehouse modernization and AWS help?
DW modernization removes limits on how much data companies can store and manage: workloads can scale from gigabytes and terabytes to petabytes or even exabytes. It also lets companies easily scale their SQL analytics solutions.
Advanced data warehouses enable companies to leverage best practices to develop competitive business intelligence projects. DW modernization is especially useful in manufacturing and hi-tech industries, where a broad diversity of data types and processing is needed for the full range of reporting and analytics.
AWS and its APN Data and Analytics Competency Partners are renowned for offering a robust range of services covering the whole data warehousing workflow, including data warehousing analytics, data lake storage, and visualization of results. AWS's data warehouse modernization platform facilitates faster insights at lower cost, ease of use with automated administration, the flexibility to analyze data in open formats and in place, as well as compliance and security for mission-critical workloads.
AWS DW would help in:

  • Swiftly analyzing petabytes of data and providing superior cost efficiency, scalability, and performance.
  • Storing all data in an open format without having to move or transform it.
  • Running mission-critical workloads even in highly regulated industries like manufacturing.
Data comes from a host of sources in contemporary data infrastructures, including sensor and machine-generated data, CRM and ERP systems, web logs, and numerous other industry-specific data sources. The majority of companies, including those in manufacturing, face difficulties just storing and managing these increasing volumes and formats of data, let alone carrying out analytics to identify patterns and trends in it. To solve this, data warehouse modernization has become a necessity: it allows businesses to swiftly meet rapidly changing business requirements, provide the needed support for new data sources, and promptly iterate new solutions. There are many platforms and tools available today that can help businesses modernize their data warehouse, AWS solutions being among the most competent ones.
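To make this concrete, here is a rough, hedged sketch of how a modernized AWS warehouse might be queried programmatically using the Amazon Redshift Data API via boto3. The cluster name, database, user, and table are hypothetical placeholders chosen for illustration; they are not part of any specific reference architecture.

```python
import time
import boto3

# Hypothetical cluster, database, user, and table names, purely for illustration.
client = boto3.client("redshift-data", region_name="us-east-1")

resp = client.execute_statement(
    ClusterIdentifier="manufacturing-dw",   # assumed cluster name
    Database="analytics",                   # assumed database
    DbUser="analyst",                       # assumed database user
    Sql="""
        SELECT production_line, AVG(defect_rate) AS avg_defect_rate
        FROM sensor_readings
        WHERE reading_date >= DATEADD(day, -7, CURRENT_DATE)
        GROUP BY production_line
        ORDER BY avg_defect_rate DESC;
    """,
)

# The Data API is asynchronous: poll until the statement finishes, then fetch rows.
while client.describe_statement(Id=resp["Id"])["Status"] not in ("FINISHED", "FAILED", "ABORTED"):
    time.sleep(1)

result = client.get_statement_result(Id=resp["Id"])
for row in result["Records"]:
    print([col.get("stringValue") or col.get("doubleValue") or col.get("longValue") for col in row])
```

The same pattern works from a scheduled job or a dashboard backend, which is one way such a warehouse can power the reports and dashboards described above.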
References
https://support.sas.com/resources/papers/proceedings/proceedings/sugi24/Emerging/p142-24.pdf
https://info.fnts.com/blog/5-common-business-challenges-of-legacy-technology
https://cloud.google.com/blog/products/data-analytics/5-reasons-your-legacy-data-warehouse-wont-cut-it
https://tdwi.org/articles/2014/05/20/data-warehouse-modernization.aspx
https://www.vertica.com/solution/data-warehouse-modernization/
https://aws.amazon.com/partners/featured/data-and-analytics/data-warehouse-modernization/#:~:text=AWS%20provides%20a%20platform%20for,compliance%20for%20mission%20critical%20workloads
Pooja Joshi
#Business | 6 Min Read
According to research by the United Nations, the global population is expected to reach 9.7 billion by 2050, roughly 2 billion more than today. This implies that the demand for food will increase, and thus crop production will also have to grow. This isn't as simple as it sounds, because several hurdles hamper agricultural supply, from food security to climate change. To overcome these challenges, we are seeing a shift towards technology and smart farming in the agricultural sector. Farming with diverse information and communication technologies is a relatively new concept, and the sector has already shown immense benefits. Lately, leading agricultural ventures have been mobilizing the potential of cloud technologies to solve an array of farming-related problems. For instance, John Deere, a renowned farm equipment manufacturer, came up with its cloud-based Operations Center software, which keeps track of a farm vehicle's performance for quick and effective troubleshooting. In India, as reported by NASSCOM in 2019, there are more than 450 agri-tech start-ups, growing at a rate of 25% annually. This sector's potential is evident from the fact that it has received more than $248 million in funding.
Countering the pressures faced by the agricultural sector has become comparatively more straightforward with modern technologies like artificial intelligence, remote sensing, data analytics, GIS, blockchain, and various Internet of Things (IoT) devices. These ensure more effective, productive, and prosperous farming practices. Among these, a data-enabled business model has been the most profitable, as the collection of real-time data helps predict the prospects of farming practices. Reports suggest that the adoption of data and analytics in agriculture has been growing consistently; the market size of global agriculture analytics is likely to increase from USD 585 million in 2018 to USD 1,236 million by 2023, at a Compound Annual Growth Rate (CAGR) of 16.2% during the forecast period. Large amounts of data are collected and integrated by experts to raise alerts and solve complicated problems related to soil quality and other operational inefficiencies.
How is data derived in smart agriculture?
IoT
The Internet of Things has positively impacted many industries, and agriculture is undoubtedly one of them. IoT helps fight several farming challenges, be it weather conditions, climate change, or the quality of seeds and pesticides. Sensors were introduced a few decades ago, but they were handled conventionally. With the introduction of IoT in agriculture, technologically advanced sensors that deliver live data are used. For example, remote sensing assists in tracking the condition of crops in a field regularly to recognize any possible risk and take the necessary precautions. Cloud-based data storage with IoT platforms plays a vital role in smart agriculture: whether farmers want to know a crop's live status or the weather, real-time data analysis is quite impactful. A few recent use cases show how IoT and big data in farming have been successful, like Vodafone's Precision Farming solution, which lets farmers use only the amount of fertilizer needed, or the Digital Transmission Network (DTN), with which farmers can examine updated weather data to manage their farmlands better.
Space monitoring
Turning to technology has been one of the most sought-after ways to maintain a smooth global food supply, and space monitoring is an integral part of smart farming. Be it frequent droughts or locust outbreaks, space monitoring through geospatial tools is a boon to agriculture. Remote sensing satellite imagery offers critical data for monitoring crops, soil development, and other climatic conditions. For example, climate, soil, and other assessments from satellites assist farmers in planning the timing and amount of irrigation needed for their crops. In this way, the adverse effects of food shortages and famines are tackled.
Use of big data in farming
While smart farming and big data are opening new doors to profitable agricultural yields, it is the collection of real-time data to forecast various situations that has most improved farming practices.
Here’s how big data is being used in recent times.

  • Prognosis of yield
    Prediction of yield is one of the most meaningful uses of data in farming, as it helps farmers evaluate questions like what to plant, and where and when to plant it. This is mainly done with mathematical and machine learning models, fed by sensor data on yield, weather status, biomass indices, and much more (a brief sketch of such a model follows this list). For farmers, the decision-making process becomes smooth and convenient, with an improved approach towards crop production.
  • Effective management of risk
    The chances of crop failure can quickly be evaluated with big data, and hence farmers find it quite useful. Daily satellite images are combined with relevant data to identify weather scenarios like wind speed, humidity level, rainfall, and much more. Back in 2014, a recommendation from data scientists to Colombian rice farmers reportedly saved millions of dollars in damages due to changing weather patterns. Such damages from changing weather conditions or other causes can often be avoided with data science.
  • Improvement of farm equipment
    Equipment manufacturing companies have integrated sensors into farming vehicles and farm equipment. This deployment of big data applications helps farmers monitor the long-term health of farm equipment and also lets users know about the availability of tractors, due dates for servicing, and so on. Recently, scientists and researchers from the University of Connecticut have brought forward in-field soil moisture sensors that reduce excessive water consumption during farming by at least 40%.
  • Bridging the gap between supply and demand
    Increasing supply chain transparency is one of the crucial benefits that big data offers. One of the main struggles in the food market is to bridge the inevitable gap between supply and demand. According to a report by McKinsey, at least one-third of food produced in a year for consumption is wasted. With real-time data analysis, several forecasts have become more accurate, and integrated planning is now possible. This also helps track and optimize the routes of delivery trucks.
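As a hedged, toy illustration of the kind of yield model mentioned above, the sketch below trains a random forest on synthetic weather and biomass features. The feature names, data, and coefficients are made up purely for demonstration and are not tied to any real agronomic dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 500

# Synthetic features: seasonal rainfall (mm), mean temperature (°C), and a biomass index.
rainfall = rng.uniform(200, 900, n)
temperature = rng.uniform(15, 35, n)
biomass_index = rng.uniform(0.2, 0.9, n)

# Synthetic yield (tonnes/ha) loosely driven by the features plus noise.
yield_t_ha = 0.004 * rainfall + 0.05 * temperature + 3.0 * biomass_index + rng.normal(0, 0.3, n)

X = np.column_stack([rainfall, temperature, biomass_index])
X_train, X_test, y_train, y_test = train_test_split(X, yield_t_ha, test_size=0.2, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

print("R^2 on held-out data:", round(model.score(X_test, y_test), 3))
print("Predicted yield for 600 mm rain, 24 °C, biomass 0.7:",
      round(model.predict([[600, 24, 0.7]])[0], 2), "t/ha")
```

In practice, the same structure applies with real field, weather, and satellite data in place of the synthetic arrays.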
The cloud-based ecosystem of IoT and big data is gearing up to revolutionize agriculture, as space monitoring and cloud-based apps help farmers adjust production in tune with market demand. Today, the scope of big data in agri-tech is exceptionally significant. Lloyd Marino of Avetta Global, an eminent big-data expert, has pointed out, “Big data in conjunction with the Internet of Things can revolutionize farming, reduce scarcity and increase our nation’s food supply dramatically; we just have to institute policies that support farming modernization.” Working together to enhance the use of smart farming and the application of data in agriculture will strengthen agri-tech solutions for a better future. This will not only increase the production efficiency of crops but also mitigate the problems of higher food demand and supply shortages.
References
https://www.un.org/development/desa/en/news/population/world-population-prospects-2019.html
https://nasscom.in/sites/default/files/media_pdf/NASSCOM_Press_Release_Agritech_Report_2019.pdf
https://www.talend.com/resources/big-data-agriculture/
https://www.researchgate.net/publication/279497092_303_Performance_of_a_New_Low-cost_Soil_Moisture_Temperature_and_Electrical_Conductivity_Sensor
https://www.mckinsey.com/business-functions/sustainability/our-insights/feeding-the-world-sustainably
https://www.forbes.com/sites/timsparapani/2017/03/23/how-big-data-and-tech-will-improve-agriculture-from-farm-to-table/?sh=a325bec59891
Pavan Thallapally
#Business | 7 Min Read
Humanity has evolved through cognitive, scientific, and industrial revolutions, and experts around the globe have designated the current era as the Information Age. Many of the principles by which we live are now replicated in the digital world. Across these physical and virtual environments, people have started thinking about how to enrich their experiences, improve productivity, and reach out to the world with their own identity, helping humanity make better decisions and take more sustainable actions for the future.
Digital products now enable food, communication, travel, health, and many other parts of global society. Most human beings adapt to these products readily; however, problems remain to be solved for people who are impaired permanently, temporarily, or situationally. Solving these accessibility problems breaks down social boundaries, supports more natural ways of interacting, reduces errors, enables seamless transitions, and widens market reach.
Designers design a product by empathizing with the target personas, represented by their mental models, and then make choices and decisions about emotions, navigation, and solutions. But for the product to serve a diverse audience, it needs to embrace everyone who will interact with it. Below are quick checklists and common mistakes committed while designing.
Quick checklists to consider
Textual Representation
  1. Avoid complex vocabulary and describe it with fewer words.
  2. Guide the content with visual hints like icons and images.
  3. Design with simple hierarchy and navigations.
  4. The information should render across all devices and resolutions without overlapping text.
  5. Make text and elements larger by default so that they render well across all types of devices.
Hierarchy
  1. Headings drive navigation of users from one section to another, which also helps the assisting technologies.
  2. Use ordered lists or groups to describe collective information.
  3. Notify warnings, errors, and success statuses against each corresponding input label.
  4. Provide subtitles and voice-over, and accept voice commands from users, when engaging with audio, video, and other media-related content.
Color Contrast
  1. Keeping interactive elements with better contrast ratios.
  2. Always combine colors with visual cues for better interactions.
  3. All the textures, patterns, and paradigms need to be described with actions and content.
  4. Ensure all users receive and can understand the same amount of information from every element.
  5. Use plugins for your design tools, such as Stark, to simulate color perception.
Input Devices
  1. All elements and content must be operable by keyboard, pointing devices, and other input methods.
  2. Make sure the focus of the element is applied on the screen for every navigation.
  3. Add a visible indicator in the layout so the keyboard focus can be seen.
  4. All images, icons, graphs, and charts should have alternative text.
Access to assistive technologies
  1. Allow assistive technologies to read and take the inputs.
  2. Enable the product for various assistive technologies.
  3. Testing all the user journeys with real users for better user experience and usability.
Social Privacy
  1. Awareness of privacy and data access.
  2. Visibility in storing and deleting their data.
  3. Product availability in offline mode.
“Addressing accessibility in the design system is the right place to start your inclusivity efforts because it’s the foundation of your product.”
Common mistakes committed while designing
Typography
Humans with visual impairments can find certain letters and styles confusing. Therefore, potentially confusing letter heights, weights, lengths, and sizes must be clearly defined.
  • The Web Content Accessibility Guidelines recommend a line-height of 1.5 for body copy. Evaluate, then reduce or increase as necessary.
  • Scanning long lines of text is taxing on your eyes. Research indicates that the average online line length is around 70–80 characters. Limit lines to no more than 16 words.
  • Centered text is not accessible text. The act of centering creates different starting positions. This creates issues for the visually impaired.
  • Never use ‘ALL-CAPS’ in body copy settings. Use Small Caps if short-length capitalization is required. They are great for emphasis, abbreviations, and subheaders
Contrast
There are a lot of things to consider when making your content accessible from a color and contrast perspective (a small contrast-ratio check is sketched after this list), including:
  • Be careful with light shades of color, especially grays–they are difficult to see for people with low vision.
  • Ensure the icon fits into equal sizes. If some have circles in it, make sure these circles have the same diameter. Icons should have a consistent style.
  • Do not rely on color alone to convey info to your users. For example, make sure your links have underlines or some other visual indicator besides color.
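To make the point about light grays concrete, here is a minimal, self-contained Python sketch of the standard WCAG 2.x contrast-ratio calculation. The hex colors are purely illustrative; WCAG AA expects at least 4.5:1 for normal body text.

```python
def _channel_to_linear(c: int) -> float:
    """Convert an 8-bit sRGB channel to linear light, per the WCAG definition."""
    s = c / 255.0
    return s / 12.92 if s <= 0.03928 else ((s + 0.055) / 1.055) ** 2.4

def relative_luminance(hex_color: str) -> float:
    """Relative luminance of a color given as '#RRGGBB'."""
    hex_color = hex_color.lstrip("#")
    r, g, b = (int(hex_color[i:i + 2], 16) for i in (0, 2, 4))
    return (0.2126 * _channel_to_linear(r)
            + 0.7152 * _channel_to_linear(g)
            + 0.0722 * _channel_to_linear(b))

def contrast_ratio(fg: str, bg: str) -> float:
    """WCAG contrast ratio: (L_lighter + 0.05) / (L_darker + 0.05)."""
    l1, l2 = relative_luminance(fg), relative_luminance(bg)
    lighter, darker = max(l1, l2), min(l1, l2)
    return (lighter + 0.05) / (darker + 0.05)

if __name__ == "__main__":
    ratio = contrast_ratio("#777777", "#FFFFFF")  # light gray text on a white background
    print(f"contrast ratio: {ratio:.2f}")          # ~4.48, just below the 4.5:1 AA threshold
```

A check like this can be run over a design system's color tokens so that failing combinations are caught before they reach production.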
Layouts
Remember that a designer and an artist are different professions. In design, we create a product for people, which means creative impulses should be applied only where they do not interfere with the user experience.
  • Avoid experimental positioning of elements on a screen/page/card without good reason. Otherwise, the user may get confused and leave your site or delete the application.
  • While considering the structure and layout, we should also consider use with a mouse, keyboard, touchscreen, or another adaptive technology device. Once this skeleton structure is ready, the styling of each sentence and paragraph comes into the picture.
  • It is essential to check with developers about screen sizes, since a website will be used on a wide range of devices and front-end developers, who usually don’t have a design background, will implement the design exactly as it is provided to them.
  • Use provided heading styles in correct order to create structure. Avoid formatting headings manually to be large and bold.
Navigation
Without a doubt, the part of any user interface where users most often get stuck is the navigation. So, dedicating extra attention to this area will improve the experience for all types of users, inclusively. Bear in mind the following tips to reduce confusion:
  • When linking, use anchor text that sets realistic expectations.
  • Maintain anchor text consistency when two links lead to the same destination.
  • Implement breadcrumbs to convey where the user stands in an event sequence.
  • Highlight the current keyboard focus (for input fields, a blinking cursor isn’t enough).
“Accessibility is solved at the design stage.”
Why designers should consider accessibility while making design decisions
User actions play a vital role in an application, and every action the user takes is preceded by a designer’s decision and plan. Good decisions help users complete their tasks with a better experience and navigation.
  1. While making decisions, you’re likely to discover and correct usability problems that affect all users, and you will also be serving users with age-related accessibility needs, a rapidly growing customer segment.
  2. Businesses want to avoid claims of discrimination and legal action when they launch a product in the market, and designers help them by implementing accessibility standards.
  3. Making design decisions while building the prototype and testing usability, before development begins, saves development time and stakeholders’ investments.
  4. One more advantage for developers and designers is that websites created with accessibility in mind have a higher-quality code base. For example, accessibility testing tools such as the a11y® testing platform can also identify errors that cause general usability problems.
Conclusion
Accessibility principles benefit people with disabilities while also expanding your customer base, polishing your brand image, increasing your search engine rankings, and bringing general improvements to usability.
References
https://rangle.io/blog/can-a-design-system-be-accessiblee
https://www.granite5.com/
https://www.bounteous.com/canada/node/63166/?lang=en-ca
https://dzone.com/articles/guide-for-htmlcss-developers-creating-layout-per-w
https://www.aditus.io/patterns/multiple-navigation-landmarks/
Renjith Raju
#Technology | 4 Min Read
AWS Transit Gateway is a highly available network gateway offered by Amazon Web Services. It eases the burden of managing connectivity between VPCs and from VPCs to on-premises data center networks, allowing organizations to build globally distributed networks and centralized network monitoring systems with minimal effort.
Earlier, the limitations of VPC Peering made it impossible to create or connect VPN connections to on-premises networks directly. Also, to use a transit VPC, a VPN appliance had to be purchased from the AWS Marketplace to connect all the VPCs to on-premises networks. This increased both cost and maintenance.
Advantages of AWS Transit Gateway
  • Transit Gateway is highly available and scalable.
  • The best solution for hybrid cloud connectivity between On-premise and multiple cloud provider VPCs.
  • It provides better security and more efficient control of traffic through separate route tables.
  • It helps manage routing across AWS accounts globally.
  • Manage AWS and On-premise network using a centralized dashboard.
  • This helps to protect against distributed denial of service attacks and other common exploits.
Cost comparison between Transit Gateway and VPC Peering

  • Cost per VPC connection: none for VPC Peering; $0.05/hour per attachment for Transit Gateway.
  • Cost per GB transferred: $0.02 for VPC Peering ($0.01 charged to the sender VPC owner and $0.01 to the receiver VPC owner); $0.02 for Transit Gateway.
  • Overall monthly cost with 3 connected VPCs and 1 TB transferred: VPC Peering: connection charges $0, data transfer $20, total $20/month. Transit Gateway: connection charges $108, data transfer $20, total $128/month.
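To show where these monthly figures come from, here is a small back-of-the-envelope calculation in Python, assuming a 720-hour month, 1,000 GB transferred, and the per-unit prices listed above (actual AWS pricing varies by region).

```python
HOURS_PER_MONTH = 720   # assumed 30-day month
GB_TRANSFERRED = 1000   # 1 TB, rounded to 1,000 GB
NUM_VPCS = 3

# VPC peering: no hourly connection charge, $0.02/GB data transfer (split across owners).
peering_total = 0 * NUM_VPCS + 0.02 * GB_TRANSFERRED

# Transit Gateway: $0.05/hour per VPC attachment plus $0.02/GB of data processed.
tgw_attachments = NUM_VPCS * 0.05 * HOURS_PER_MONTH
tgw_total = tgw_attachments + 0.02 * GB_TRANSFERRED

print(f"VPC peering:     ${peering_total:.0f}/month")   # ~$20
print(f"Transit Gateway: ${tgw_total:.0f}/month")        # ~$128 ($108 attachments + $20 data)
```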
Transit gateway design best practices:
  • Use a smaller CIDR subnet and use a separate subnet for each transit gateway VPC attachment.
  • Based on the traffic, you can restrict NACL rules.
  • Limit the number of transit gateway route tables.
  • Associate the same VPC route table with all of the subnets that are associated with the transit gateway.
  • Create one network ACL and associate it with all of the subnets that are associated with the transit gateway. Keep the network ACL open in both the inbound and outbound directions.
In the architecture below, HashedIn, as a cloud organization, has a master billing account, accounts for logging, security, and hosting the networking infrastructure, a shared services account, three development accounts, and one production account. AWS Transit Gateway is the single point for all connectivity.
For each account, the VPCs are connected to the Transit Gateway via a Transit Gateway attachment. Each account has a Transit Gateway route table associated with the appropriate attachment that sends its traffic, and subnet route tables can then be used to reach the other networks. The Network account's transit gateway is connected to the on-premises data center and other networks.
Here are the steps to configure multiple AWS accounts with AWS Transit Gateway:

  • Firstly, access the AWS Console
  • Up next, create the Transit Gateway
  • Lastly, create Transit Gateway Attachment
    • VPC
    • VPN
    • Peering Connection
These are the three options available while creating a Transit Gateway attachment.
With the VPC option, an ENI is created in multiple Availability Zones. A TGW attachment needs to be created in every Availability Zone used by the VPC so that the TGW can communicate with the ENI attachment in the same Availability Zone.
  • Create Transit Gateway Route Table
  • Add the routing rule for the respective TransitGateway ID
  • Create an association and attach TransitGateway attachments
  • Create static routes
Transit Gateway attachments are associated with a Transit Gateway route table, and multiple attachments can be associated with a single route table. An association determines which route table an attachment uses, while propagation dynamically populates the routes of one attachment into the route table used by other attachments.
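The steps above can also be scripted. Below is a hedged boto3 sketch that creates a transit gateway, attaches a VPC, and sets up a route table with an association and a static route. The VPC, subnet, and CIDR values are hypothetical placeholders, and the caller is assumed to have the required IAM permissions.

```python
import time
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# 1. Create the transit gateway (the shared hub for all accounts/VPCs).
tgw_id = ec2.create_transit_gateway(Description="central-hub")["TransitGateway"]["TransitGatewayId"]
while ec2.describe_transit_gateways(TransitGatewayIds=[tgw_id])["TransitGateways"][0]["State"] != "available":
    time.sleep(15)  # the gateway must be 'available' before attachments can be created

# 2. Attach a VPC, using one subnet per Availability Zone (IDs are placeholders).
attachment_id = ec2.create_transit_gateway_vpc_attachment(
    TransitGatewayId=tgw_id,
    VpcId="vpc-0123456789abcdef0",
    SubnetIds=["subnet-aaa111", "subnet-bbb222", "subnet-ccc333"],
)["TransitGatewayVpcAttachment"]["TransitGatewayAttachmentId"]
while ec2.describe_transit_gateway_vpc_attachments(
        TransitGatewayAttachmentIds=[attachment_id]
)["TransitGatewayVpcAttachments"][0]["State"] != "available":
    time.sleep(15)

# 3. Create a transit gateway route table and associate the attachment with it.
rtb_id = ec2.create_transit_gateway_route_table(
    TransitGatewayId=tgw_id
)["TransitGatewayRouteTable"]["TransitGatewayRouteTableId"]
ec2.associate_transit_gateway_route_table(
    TransitGatewayRouteTableId=rtb_id,
    TransitGatewayAttachmentId=attachment_id,
)

# 4. Add a static route pointing a remote CIDR (e.g. the on-premises range) at this attachment.
ec2.create_transit_gateway_route(
    DestinationCidrBlock="10.20.0.0/16",
    TransitGatewayRouteTableId=rtb_id,
    TransitGatewayAttachmentId=attachment_id,
)
```

The same calls can be repeated per account, or wrapped in infrastructure-as-code, to reproduce the multi-account layout described above.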
In addition, AWS Network Manager helps reduce the operational complexity of connecting remote locations and other cloud resources. It also acts as a centralized dashboard to monitor end-to-end networking operations in our AWS account.
Conclusion
The Transit Gateway is a centralized gateway through which we can manage AWS and on-premises networks on a single dashboard. It also helps simplify the network architecture, which was earlier complicated by managing inter-VPC connectivity and Direct Connect separately.
Priyanka Upadhyay
#Business | 5 Min Read
Agile is one of the most innovative and effective project management methods widely used by companies today. But when it comes to implementing the methodology, we are often at a crossroads when balancing fixed-cost projects while managing them in an agile environment. Many aspects have to be considered, from cost overruns and derailed scope to impacted quality, the team’s workload, and lack of motivation. Let’s go through some practices that can help us run fixed-cost projects in sync with the agile way of development and come out successful.

Resource Planning
We all aim for maximal or optimal profit when working on a project. Keeping costs in control is an essential requirement in all projects, especially fixed-cost ones run in an agile environment. The primary cost driver in a project is the team working on it; hence, how you go about your resourcing plan affects the success and profitability of the project. This includes enhancing productivity and building expertise that complements the demands of the project. Here are the key phases:

  • Form: Instead of going for a full-grown team at the kick-off phase, it is a better idea to get going by forming a core-team of Project Owners at that phase. This step can be undertaken to get the initial understanding of the project requirements, analyzing critical project decisions and processes, and deciding on a framework.
  • Expand: Add in Module Owners.
  • Full Grown: Get the complete team set up.
  • Right Size: The right team size, based on the skill set required and the complexity of the work, is of paramount importance for a smooth and hassle-free project. Release the remaining members to the resource pool so the project as a whole is planned wisely.
Rebaseline Estimates
The unavoidable gap between project estimates and actuals has been a significant hurdle in fixed-cost projects. Though estimates should ideally hold for the resources proposed, things don’t happen this way in reality. Thus, once the project begins, it is crucial to re-baseline the cost. Here is an example to illustrate this point.
Scenario:
Proposal Team Estimates (Presales): Cumulative estimate of 160 story points. The team forecast an average velocity of 20 story points per monthly iteration. The proposal assumed a fixed team for the duration and arrived at an estimate of eight months.1

Delivery Team Estimates: Actual implementation team came up with entirely different baseline estimates. Their estimation was 180 story points. With the staffed resources, the team predicted a 15 point velocity per iteration, estimating completion in 12 months.
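The gap between the two estimates follows from simple arithmetic; the sketch below just reproduces the numbers from the scenario above.

```python
def months_to_complete(story_points: int, velocity_per_month: int) -> float:
    """Naive duration estimate: total scope divided by monthly velocity."""
    return story_points / velocity_per_month

presales = months_to_complete(160, 20)   # proposal (presales) estimate
delivery = months_to_complete(180, 15)   # delivery team's re-baselined estimate

print(f"Presales estimate: {presales:.0f} months")   # 8 months
print(f"Delivery estimate: {delivery:.0f} months")   # 12 months
print(f"Gap to close:      {delivery - presales:.0f} months")
```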


In the above scenario, we got a fair idea of the gap at the start of the project itself, and that helped us take measures to reduce it. These could include training and upskilling to safeguard velocity, and looking for ways to optimize the team’s productivity, such as reusable code and getting things right the first time. The gap could also be reduced with the right blend of developers in terms of experience and expertise, for example two strong junior performers rather than one highly experienced senior. Evaluating the scenario with the client and trying to get their buy-in is another possible option.

Follow the Flexibility Matrix
A flexibility matrix is a useful tool that facilitates conversations around project trade-offs and helps analyze the client’s priorities across the various project constraints, namely scope, schedule, cost, quality, etc. It makes it easier to understand each constraint’s importance in a project and to assess how much room there is to flex each one during decision-making. When choosing between, say, GTM and 80% code coverage, project objectives should be prioritized according to the client’s business needs.

[Figure: Use of a flexibility matrix]

Communication & Visual Workspace
We can use communication mediums like storyboards and release walls to communicate the overall project. Communication is one of the critical elements in agile, and when a fixed-cost project is discussed, a healthy communication plan is crucial to its success. Interaction with stakeholders and clients should be smooth, and regular updates are a must during meetings. Product backlogs must be prioritized, and the project manager must clarify every doubt of the team. Here are some key features of an exemplary communication model.

  • Storyboard: This provides information on the current sprint and the tasks and features at hand. It also guides progress evaluation, giving all stakeholders visibility at the sprint level. It should be updated every day to maintain transparency in the team.
  • Release Wall: The release wall gives a whole-project perspective at a glance. With its support, we can determine the project’s present status and its scope in the near future.

Maintaining agility in fixed-cost projects can be a daunting task, but with concrete planning and strong project management skills, tackling the challenges is possible. Identifying the risks and handling them upfront is one of the most compelling and feasible ways to maintain smooth operations while working with fixed-cost projects in an agile environment.

References
https://www.pmi.org/learning/library/firm-fixed-price-agile-projects-6231
https://www.pmi.org/learning/library/firm-fixed-price-agile-projects-6231
https://www.pmi.org/learning/library/firm-fixed-price-agile-projects-6231
https://www.cs.colorado.edu/~kena/classes/5828/s12/lectures/21-agileprojectexecution.pdf
https://www.cs.colorado.edu/~kena/classes/5828/s12/lectures/21-agileprojectexecution.pdf
Judelyn Gomes
#Business | 6 Min Read
The enormous growth in technology has revolutionized our shopping experience in every possible way. With varied e-commerce portals and mobile apps, we are just a click away from ordering and receiving our favourite products. Online shopping platforms have made it convenient for individuals to choose where they can shop, purchase, and return across all retail channels with ease. The evolution in technology has influenced the evolution in the world of commerce too. What it takes to make a retail enterprise successful has expanded far beyond the basics of the product, brand, or single touchpoint.

With the pandemic sending a wave of uncertainty, retailers are forced to meet customer expectations without any compromise. To make this a reality, retailers are pushed to integrate their processes and systems and provide seamless customer experiences that allow shoppers to move effortlessly across the various touchpoints (online, in-store, mobile, web, etc.). This paved the way for Unified Retail Commerce.

Unified Retail Commerce is a platform that combines e-commerce, m-commerce, order fulfillment, inventory management, customer relationship management, Point of Sale (POS) capabilities, etc under a single roof. According to Brian Brunk, Principal at retail industry consulting firm Boston Retail Partners (BRP), “Unified commerce goes beyond omnichannel, putting the customer experience first, breaking down the walls between internal channel silos and leveraging a common commerce platform.”


Components of a Unified commerce strategy:


A unified commerce strategy consists of four major components:

  • Interactions: As a retailer, it can become extremely challenging to predict a customer’s behavior or journey, especially when there are numerous products. In unified commerce, the customer interacts with different products and services all under one platform. It is therefore important to record the customer’s interactions, which gives better insight into how the business should be tailored to provide the best customer experience.
  • Channels: The channels provided to customers should be hassle-free and satisfactory. The major notion behind unified commerce is to ensure that the customer gets a pleasant experience irrespective of whether they stay on one channel or switch between them. A customer should be able to access all information, be it in person or online. For example, having the same promotions available in person, online, or through a mobile app minimizes the customer’s search for better deals.
  • Systems: It is mandatory that the systems work cohesively with one another. Implementing an ERP system is one way to achieve a unified integration of both processes and systems. With systems linked together, they enable a smooth customer journey from beginning to end.
  • Products: In a unified commerce ecosystem it is essential to provide accurate information about the products and services, which are the cornerstones of the customer’s journey. Along the buying process, it is of utmost importance to unify your product information across all channels so that customers get the right information irrespective of the channel they are visiting.
How can a unified commerce platform help a retail organization?
The unified commerce platform has become a boon to both retailers and customers. According to the National Retail Federation,

  • “In the near-term, 53% of retailers surveyed across markets plan to implement a unified commerce platform to consolidate key data elements, business rules, and functionality historically housed in multiple systems.
  • Australia 60%, US 53%, and Europe 51%. This outlook represents a nearly 50% increase year-over-year.
  • Over the next ten years, 86% of surveyed retailers plan to leverage a unified commerce platform, making it the emerging standard as technology matures.”

According to the Boston Retail Partners 2018 POS/Customer Engagement Survey of 500 top North American retailers, 81% of retailers will have deployed unified commerce by the end of 2020.5 Here’s how unified commerce can enable an organization’s growth:

  • Growth in sales: Organizations are turning towards a unified commerce approach as it consistently elevates the customer’s experience and makes the buying and shopping process simpler. Customers are at leisure to shop the way they are comfortable with, and in return organizations gain momentum in sales. According to ChannelAdvisor, “eBay, Amazon and other marketplaces such as Rakuten.com Shopping, Newegg, Sears, and others account for about a quarter of total online retail sales, and are continuing to grow.”
  • Customer experience, the center of attraction: Gartner reports that 89% of retail businesses believe that by 2020, most competition will center their businesses around the quality of the customer experience.7 Strategize the organization’s growth based on customer feedback and interaction. Keep in mind that the customer’s shopping experience is the key: as customers interact with an organization through varied channels, they expect accurate information and quality products, irrespective of whether they shop online, in-store, or through a call center.
  • Strong digital presence: Digitalization is the new normal driving the growth of an organization. Unified commerce, being digitally sound, enhances an organization’s presence in the digital world. According to GoECart, “Digital interactions influence approx. 36 cents of every dollar spent at retail stores (worth $1.1 trillion)”.
  • Clear forecasting: The data collected from various sources allows companies to build their marketing and sales strategies and revenue-generating resources based on the acquired information.

Unified retail commerce has managed to blend the offline and online shopping experience and make it more accessible to customers all over the world. In the current scenario of social distancing and the fear of venturing outside, a unified platform is a one-stop solution that caters to customers’ needs and digitizes stores for a better experience. It has helped retailers differentiate and power the in-store experience, develop a closer bond with customers by connecting with them on varied channels, and massively improve operational efficiency. Unified retail commerce helps overcome today’s challenges and provides a roadmap to deliver the seamless retail experience that customers expect.

References
https://www.lsretail.com/blog/unified-commerce-need-retail-business
https://medium.com/@paldesk/what-is-unified-commerce-and-how-it-will-help-your-business-b6ef9dea0d18
https://medium.com/@paldesk/what-is-unified-commerce-and-how-it-will-help-your-business-b6ef9dea0d18
https://www.retailsupplychaininsights.com/doc/why-unified-commerce-platform-is-a-must-for-business-growth-0001
https://www.cleo.com/blog/knowledge-base-unified-commerce
https://www.retailsupplychaininsights.com/doc/why-unified-commerce-platform-is-a-must-for-business-growth-0001
https://www.sikich.com/insight/6-ways-a-unified-commerce-system-can-completely-transform-your-retail-business-part-1/
https://www.retailsupplychaininsights.com/doc/why-unified-commerce-platform-is-a-must-for-business-growth-0001
Manish Dave
#Technology | 4 Min Read
Objective
Picture yourself in possession of a sample web application and an HA (multi-zone) Kubernetes cluster, needing to run the application with high availability on that cluster. The first move any K8s expert would make is to deploy the application onto the nodes, but there is no assurance that each node has at least one pod running. This blog gives you a clear solution to that scenario.

Kubernetes allows selective placement of new pods using affinities. These can be good solutions to common HA problem statements such as:

  1. Running n pods on each node
  2. Ignoring a certain node group for a group of pods
  3. Preferring certain regions, AZs, or nodes when deploying auto-scaled pods

Let’s discuss all these in detail. Before proceeding please find below some of the common terminologies that would be used in this blog.

  • podAffinity: can tell the scheduler to locate a new pod on the same node as other pods if the label selector on the new pod matches the label on the current pod.
  • podAntiAffinity: can prevent the scheduler from locating a new pod on the same node as pods with the same labels if the label selector on the new pod matches the label on the current pod.
  • weight: can be any value from 1 to 100. The weight gives the matching node a relatively higher preference than other nodes. The more you want your preference to be fulfilled, the higher you should set the weight.
  • topology: can be defined as node labels
  • requiredDuringSchedulingIgnoredDuringExecution (HARD): with this approach, the scheduler places a pod on a node only if the rule is satisfied. As a result, only one pod will be deployed to each node, and the remaining pods will stay in the Pending state.
  • preferredDuringSchedulingIgnoredDuringExecution (SOFT): with this approach, the scheduler first prefers nodes that satisfy the rule and, if none exist, deploys to non-preferred nodes. Combined with a weight in the deployment, this lets pods get evenly distributed among nodes.
    So, a rule of podAntiAffinity with SOFT scheduling will do the task here!

    First, let’s have a look at the deployment of the below yaml file which uses podAntiAffinity with a replica count of 3.

Deployment with soft podAntiAffinity:

apiVersion: apps/v1
kind: Deployment
metadata:
  creationTimestamp: null
  labels:
    run: nginx
  name: nginx
spec:
  replicas: 3
  selector:
    matchLabels:
      run: nginx
  strategy: {}
  template:
    metadata:
      creationTimestamp: null
      labels:
        run: nginx
    spec:
      affinity:
        podAntiAffinity:
          preferredDuringSchedulingIgnoredDuringExecution:
          - podAffinityTerm:
              labelSelector:
                matchExpressions:
                - key: run
                  operator: In
                  values:
                  - nginx
              topologyKey: failure-domain.beta.kubernetes.io/zone
            weight: 100
      containers:
      - image: nginx
        name: nginx
        resources:
          limits:
            memory: "200Mi"
            cpu: "200m"
          requests:
            memory: "100Mi"
            cpu: "100m"
status: {}

This anti-affinity rule tells the scheduler to avoid placing two pods with the label run=nginx in the same topology domain, here the zone identified by the failure-domain.beta.kubernetes.io/zone key. This deployment was applied to a multi-zone HA cluster with three master nodes.

Result:
As you can see here, each pod is deployed to a different Kubernetes node (master1, master2, and master3).

We have already seen what happens with the initial deployment; now let's see what happens while scaling it.

Scale the replica to a higher count, let’s say 6!

kubectl scale deployment nginx --replicas=6

Result:
As you can see the next set of 3 pods also got distributed evenly over the nodes. Each node has 2 pods of nginx running.
Will the same work with HPA (Horizontal Pod Autoscaler) or not?
Configure the HPA and create a load generator pod that hits the application endpoint with multiple requests so that it can trigger scaling (a hedged example of the HPA configuration follows below).
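As an alternative to running kubectl autoscale by hand, the sketch below creates an equivalent autoscaling/v1 HPA for the nginx Deployment using the official Kubernetes Python client. The replica bounds and CPU target are illustrative values, not ones prescribed in this post.

```python
from kubernetes import client, config

config.load_kube_config()  # assumes a local kubeconfig with access to the cluster

hpa = client.V1HorizontalPodAutoscaler(
    api_version="autoscaling/v1",
    kind="HorizontalPodAutoscaler",
    metadata=client.V1ObjectMeta(name="nginx", namespace="default"),
    spec=client.V1HorizontalPodAutoscalerSpec(
        scale_target_ref=client.V1CrossVersionObjectReference(
            api_version="apps/v1", kind="Deployment", name="nginx"
        ),
        min_replicas=3,                        # illustrative bounds
        max_replicas=9,
        target_cpu_utilization_percentage=50,  # scale out above 50% average CPU
    ),
)

client.AutoscalingV1Api().create_namespaced_horizontal_pod_autoscaler(
    namespace="default", body=hpa
)
```

Because the Deployment keeps its soft podAntiAffinity rule, pods added by the HPA are still spread across the nodes as evenly as the scheduler can manage.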

Now as you can see the newly launched pods are successfully distributed among the nodes.

Result:
podAntiAffinity is therefore a simple and effective way to achieve high availability for deployments. We hope this blog has helped you understand the importance of pod anti-affinity.
Pooja Joshi
#Business | 5 Min Read
Prior to indulging in an exquisite meal, one would want to choose a fine wine to go along with it. The more mature your wine is, the better it tastes. When it comes to wine, age is preferred but it is not the same when it comes to your software. For example, your current version of Windows has thousands of updates and is far more capable than its previous versions. The system you choose to invest in becomes the very foundation of your business. Application modernization is the tool that bridges the gap from where your team is, to the modernized approach you want them to adopt.

Legacy application modernization is essentially a project designed to create new business value from existing, aging applications. This is achieved by updating and equipping applications with contemporary features and capabilities.

Organizations are forced to continually reassess their array of applications to determine whether they still provide adequate business value. Application modernization updates your legacy applications so they fit and scale with your company's architecture.

Most businesses across all industries are turning to cloud-based architecture to support their IT workloads. According to a survey by LogicMonitor, more than 83% of enterprise IT workloads will be based on the cloud by 2020.1 Most organizations are starting to pay attention to legacy application modernization to cope with the accelerated development of IT architecture. The reasons for this accelerated transformation are:

  • Lagging on speed
  • Old fashioned UI/UX
  • Difficult to update software
  • Security issues
Importance of Application Modernization
Before we start exploring the keys to application modernization, it is crucial to know its importance. The systems you choose to invest in become the very foundation on which your business works, and with the right systems and tools in place, your business can reach its full potential.

Some of the significant benefits of application modernization are as follows:

  • Cost Reduction: When you move from on-premise to cloud-based solutions, you are likely to experience more than a 5x reduction in cost. Organizations spend more than 60% of their budget on legacy systems, and the move to cloud-based architecture can significantly reduce cost.2

  • Improved business agility: Companies with modernized applications stand a better chance of serving their vendors and customers efficiently, because legacy systems make it very hard to develop new features. Applications that have been redesigned to suit distinct business requirements ideally have well-managed databases, better coding structures, and high flexibility.

  • Better Security: One of the most important reasons to consider modernizing your existing legacy system is better security and protection of your critical business transactions.

    Considerations to App Modernization
Most organizations choose to run on legacy applications and ancient setups because they are not willing to take the perceived risk of cloud-based infrastructure. The application modernization process can be difficult without well-strategized approaches that lead to a significant business outcome.

Fundamentally, organizations setting out on any application modernization activity should take the time to dissect the legacy system they need to refactor. If the application is to be repurposed into an exceptionally strong SaaS application, organizations need to thoroughly examine their operating systems and servers to guarantee they are equipped for it.

    Cloud Suitability Assessment
Before the suitability assessment, a general plan is chalked out for successful migration to the cloud. In general, application migration or transformation to the cloud is the process of redeploying an application on a newer platform. This foundation is administered by Cloud Service Providers (CSPs) and is assessed for resilience, stable economics, operational cost, and further scalability.

    To plan for a successful migration to the cloud, organizations must first implement a cloud suitability assessment, based on business and technology requirements, to determine which of their on-premise applications fit in the cloud.

    The Urgency of Application Modernization
The principal goal of modernizing your applications is to align application capabilities with your corporate strategy and business workflows. For a modernization project to be truly successful, the IT goals should be aligned with the business goals of the organization.

    If you decide to continue with your legacy application, your efficiency and business output will pale in comparison to your competitors, as they will move on to greener pastures through modernization and the latest infrastructure. Even if you optimally work on a cloud-based foundation now, chances are that new features or updates may cause your application to break down in the near future.

    Therefore, it is important for your business to leverage the optimal capacity of cloud-based technology while ensuring that its system can competently sustain the evolving business requirements. A proper strategy must be put in place to make sure of this fact. This must involve the right elements for your business to have a flawless transition to application modernization.

    References
    https://www.logicmonitor.com/resource/the-future-of-the-cloud-a-cloud-influencers-survey/?utm_medium=pr&utm_source=businesswire&utm_campaign=cloudsurvey
    https://www.imaginovation.net/blog/why-businesses-should-modernize-legacy-applications/
    Judelyn Gomes
    #Business | 5 Min Read
In our previous blog we focused on how product management is the key to surviving disruption. In continuation of our series, this blog emphasizes the importance of data-driven decision making in product management. Experts widely agree that data plays a vital role in the development of the product as well as the company. Every product manager at some point talks about data-driven product management, but in what way do data-driven decisions enhance the product management journey?
    The importance of data-driven decision making in product management
Data-driven product management has become the talk of the industry lately. The astonishing fact is that product managers have been following a data-driven pattern for quite a while. Long gone are the days when gut feeling alone mattered; data is now every individual's new best friend. How important is data-driven decision making?

    • Every valuable insight obtained from data helps the organization take the right decision.
    • Helps the organization realize what their customers need and scrutinize any existing problem.
    • Decisions based on a steady stream of data will induce more factual decisions rather than instinctive decisions.
    • Helps structure the growth of both the product and the organization.
    • Ensures a better plan in place for building new products.
    • Helps to increase the efficiency of product managers and enables them to jot down a plan for smooth launch of the product in a creative and effective way.
    • Tracks the evolution of product management.
    • Enables better management of stakeholders and aligns the product team.
    • Decisions based on data help in better forecasting, marketing, business decisions, and other core strategic goals.
    • Data-driven concepts and decisions help leverage data for the organization and identify new revenue streams.
    Data-driven product management life cycle
    Gartner predicts that by 2021, some 75 percent of independent software vendors will embed software usage analytics in their products to inform product management decisions and measure customer health.1 A data-driven product management life cycle consists of the following:

    Problem solving and product assumptions: Problem solving is the key feature of any product management cycle. The major decisions to take into account are the criticality of the problem and the time involved in solving it. As we ideate to make a better decision, the one backed with data always takes the upper hand. The potential features and assumptions of the product can be built on the data collected and improved by validating them. Data lies at the base of any product story-line and is the foundation of any product strategy.

    Sampling inspection and feedback: Product samples should be inspected based on the data collected and the feedback received from customers. The best way to improve a product and its sales is to implement the feedback received from customers and make the product one that satisfies them. According to Product Craft, “A number of software companies offer design partner programs where they recruit customers to test and validate the features of a product before the product gets announced as GA (general availability) in the marketplace.”2

    Data analysis: Data analysis or data evaluation plays an important role in determining the growth of a product. Data collected on customer experience and usability improvements is what lets a product be built on facts. It is necessary to maintain a centralized database and enforce structured data governance for a better outcome for both the product and the organization. Analyzing the collected data helps the organization determine a roadmap that reduces risks and avoids operational losses.

    Tracking and reporting data in the right format: Tracking and reporting data paves the way for the sales and marketing teams to build a pipeline for launching the product in the market. The collected data should be presented in the right format, as it feeds metrics such as usage frequency and retention, two key factors that determine the sale of the product. To take a data-driven decision, it is important to keep in mind factors such as customer conversion, product retention, and engagement scores (a small sketch of computing such metrics follows below).
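As a hedged illustration of such metrics, the snippet below computes a simple conversion rate and week-over-week retention from a toy event log using pandas. The column names, events, and values are entirely made up for demonstration.

```python
import pandas as pd

# Toy product-usage event log (entirely synthetic).
events = pd.DataFrame({
    "user_id": [1, 1, 2, 2, 3, 3, 4, 5, 5, 5],
    "event":   ["signup", "purchase", "signup", "active", "signup",
                "purchase", "signup", "signup", "active", "active"],
    "week":    [1, 1, 1, 2, 1, 2, 2, 1, 1, 2],
})

signed_up = events.loc[events["event"] == "signup", "user_id"].nunique()
converted = events.loc[events["event"] == "purchase", "user_id"].nunique()
conversion_rate = converted / signed_up

week1_users = set(events.loc[events["week"] == 1, "user_id"])
week2_users = set(events.loc[events["week"] == 2, "user_id"])
retention = len(week1_users & week2_users) / len(week1_users)

print(f"Conversion rate: {conversion_rate:.0%}")        # users who purchased / users who signed up
print(f"Week 1 -> 2 retention: {retention:.0%}")         # week-1 users also active in week 2
```

In a real product these figures would come from the centralized event store mentioned above rather than a hand-built DataFrame.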

    How can your organization become data-driven?
    According to Product Management Insider, “91% claimed it’s important or very important to make data-driven decisions, but only 57.4% often or almost always do so.”3 Being data-driven helps navigate the growth of an organization, here’s how:

    • Learn the art of positioning your vendors. Gartner uses the “Magic Quadrant” methodology in its research to show the market positioning of vendors. The vendors involved should be segregated based on their ability to execute and how much they can contribute towards accomplishing the organization’s vision.
    • List down potential benefits of your data which can directly impact the productivity of the organization.
    • Use your data as a tactical prediction tool and invest in collecting the right data, this would help you tailor your decisions based on the obtained predictions.
    • Improve your data tech stack regularly, this helps your organization stay updated and make use of the relevant technologies.
    • Maintain a repository of data (a single source of truth), and later make it accessible to everyone in the organization and encourage them to work based on it.
    • Provide data preparation tools such as Altair, Infogix, etc., that help enhance data driven decisions.
    • Adhere to a goal-oriented approach and implement modern data governance practices.

The journey to becoming a data-driven organization evolves over time; the major concern is the loss of momentum once a project is over. The key to growing your organization is to stay patient and disciplined and to endure the strenuous yet fruitful path to becoming data-driven. As W. Edwards Deming put it, “Without data, you’re just another person with an opinion.”

References
1. https://www.dataversity.net/characteristics-data-driven-product-management-development/#
2. https://productcraft.com/best-practices/the-four-keys-to-data-informed-product-managemen
3. https://medium.com/pminsider/the-role-of-decision-making-in-the-product-development-journey-cb3f0cbac484
    Pooja Joshi
    #Business | 6 Min Read
Businesses that are agile can rapidly adapt to internal and external market changes without compromising on quality. They respond quickly and are flexible to customer demands. Rather than being “a machine”, an organization is more like “an organism”1 that continuously evolves by being adaptive, creative, and resilient.

    To be agile, we need to first understand what exactly it entails.

For organizations to be agile, they need to have the following key characteristics:

1. Strategy: Shared purpose and vision.
2. Structure: Closely knit, empowered teams that take extreme ownership; a stable top-level structure, with a flexible, flat hierarchy across the rest of the organization.
3. Process: Continuous learning and sharing, rapid experimentation and thinking, efficient decision making, and radical honesty.
4. People: A cohesive community with a common culture, with people at the center.
5. Technology: Digitization of organizational processes and integration of next-gen development and delivery practices into the business.
    Post-Pandemic Agility
How do we ensure that innovation stays consistent, pandemic or not? Do we have to constantly be pushed out of our comfort zones to think radically?

Truly agile organizations don’t see a trade-off between flexibility and stability. In their early years, start-ups are known for their dynamism, but as they scale they become bogged down by bureaucracy3.

Transitioning to the agile way of being isn’t simple; it requires training, behavioral change, and new technology. Leadership therefore needs to decide whether the shift is worth the effort and cost. Budget limitations, availability of manpower, return on investment, the cost of delays, risks, interdependencies among teams, and customer and employee feedback all have to be kept in mind before making the move to agile4.

    Moreover, agile will not work in certain conditions5. Let’s look at some of these constraints:

• Market environment: stable and predictable
• Customer involvement: requirements are clear upfront, and customers are unavailable for constant collaboration
• Innovation type: clear solutions based on previous work, delivered in silos
• Modularity of work: parts can’t be tested until the product is complete, and changes are not accommodated
• Impact of interim mistakes: mistakes could be disastrous and are not treated as learnings
    Long-term Agility
It is fair to say that agile isn’t for every organization. However, agile can help organizations optimize their processes and gain better control of their operations. Being agile means constantly walking the tightrope between structure and innovation. Done right, an organization can do so without compromising on efficiency, continuously improving and thinking outside the box.

Having said this, when working remotely is our new normal, can teams still embrace agile? The answer is a resounding yes. Now that we don’t really have an option, here are some ways to be agile while working remotely:

• Prioritization of tasks
• Small, cross-functional, autonomous teams
• Frequent meetings that ensure everyone is aligned and collaborating
• Agile leaders who focus on concrete output and goals and enable teams to work towards a common vision
• Building a virtual sense of culture through empathy, transparency, and engagement

    Let’s take a look at organizations that are doing agile right:

    Spotify
Spotify has organized its teams into “squads” of 4 to 6 people. Squads are part of a larger “tribe” and fit loosely into “chapters” and “guilds”. Squads are fully autonomous and work independently; they can be thought of as independent start-ups that share knowledge but don’t collaborate often. Despite the volatile nature of this structure, chapters and guilds help the organization build its community and corporate culture. Spotify has consciously decoupled its teams where possible to avoid scaling issues and achieve agility. Guided by agile coaches, squads are able to innovate and act quickly without conforming to rigid hierarchies.
    HashedIn
Now that circumstances have forced everyone to work remotely, one can say that HashedIn was prepared, since its values align precisely with building a culture of agility. During these times, having #Units that act as independent entities means smaller high-performing teams can adapt to the fast-changing requirements of clients. Despite the change in business operations, every team is driven by delivering customer delight. Incorporating customers’ iterations is seen as a building block that ensures a competitive advantage. Teams that are highly motivated to get the job done in a short amount of time are key. Being agile ensures that self-organizing teams have the independence to play to their strengths. HashedIn actively cultivates an environment of support and constant learning. The end goal when building software, whether from a business, design, developer, or end-user’s point of view, is that it should be sustainable. At HashedIn, teams are constantly reflecting on how to optimally build, transform, and launch innovative products for customers across the globe. Being agile is about learning from each other, and it is this constant, high-paced learning environment that is the secret of the organization’s growth.
    Support from the Leadership Team
The pandemic has forced many organizations to think on their feet and come up with innovative ways of doing business: if they don’t swim, they sink. The command-and-control style of management rarely works in unforeseen circumstances, and top management has realized that being agile is the best way forward. However, being agile is more than sporadic bursts of innovation; it is a systematic way of doing things. Cultivating a dynamic work environment, where constant change is seen as an opportunity rather than a barrier, is the best way forward. Being given the freedom to experiment, falter, and succeed is part of the process. Top management has a key role to play: giving teams this freedom is what leads to innovation and to risks that pay off.
References
1. https://www.mckinsey.com/business-functions/organization/our-insights/the-five-trademarks-of-agile-organizations
2. https://www.mckinsey.com/business-functions/organization/our-insights/the-five-trademarks-of-agile-organizations
3. https://www.mckinsey.com/business-functions/organization/our-insights/agility-it-rhymes-with-stability
4. https://hbr.org/2018/05/agile-at-scale
5. https://hbr.org/2016/05/embracing-agile
6. https://www.bcg.com/publications/2020/remaining-agile-and-remote-through-covid.aspx