I think Cognizant is among the modern organizations that perpetuate an inhuman organizational culture. After reading the text and watching the video, I have realized that Cognizant underpays its workers, micromanages employees, fires them for making only a few errors, leaves them to cope with the traumatic images and videos they review by having sex at work, and then sits back while former workers develop symptoms that mimic Post-Traumatic Stress Disorder (PTSD). Based on this information, I believe that Cognizant is concerned only with the monetary benefits it derives from its employees and cares little about the psychological and emotional harm its working conditions inflict on them.
Facebook uses a similar approach to the one practiced at Cognizant. In 2018, Facebook was forced to employ additional content moderators to deal with the violent and exploitative content on the site. The organization relies heavily on contract workers because they are cheaper to maintain, which in turn increases the profit margins Facebook posts every year. I wonder why a large organization like Facebook would not be concerned about its employees' wellbeing, considering that these same workers enable the firm to make billions of dollars in profit annually. I think this is greed of the highest order: making profits at the expense of employees' psychological wellbeing. It is unfair, because the approach does far more harm than good to the very employees whose labor keeps the firm running.
Until big organizations such as Cognizant and Facebook recognize the need to promote equity and provide humane working conditions, content moderators will remain a marginalized group in the labor market. These organizations must understand the extent of the psychological damage content moderators suffer from exposure to traumatic material. Only then can they put in place treatment interventions that help content moderators cope with the short-term and long-term effects of the content they watch daily.