
TalkLife - Understanding Our Commitment to Safety in Discussions of Self-injury, Suicide, and Eating Disorders

Last Updated - June 2024

This section covers sensitive topics, including self-harm, suicide, and eating disorders, which may be distressing. While it is important to understand our safety protocols, please read on only if you feel comfortable doing so. If it helps, you may want to have someone you trust accompany you as you read this information.


Purpose:

 

At TalkLife, we prioritise creating a safe environment where our members can share their experiences and support each other, especially when discussing sensitive and potentially distressing issues such as self-harm, suicide, and eating disorders. This type of content may be harmful to young people and other vulnerable users, and we therefore monitor and moderate this content, as well as provide signposting to crisis support options and resources.

 

Our community guidelines let our users know what is and is not okay to post, and are designed not only to prevent harm but also to foster a supportive and understanding community, complying with online safety regulations and adhering to best practice in mental health support.

 

Understanding the limitations of our platform is crucial. TalkLife is an online peer support community, not a clinical or crisis service. We cannot assess, diagnose, or track an individual's mental health, nor can we make clinical assessments of our users. Our aim is to provide a space for sharing and mutual support; while this can complement professional help in some instances, it cannot replace it. If you find yourself in a crisis situation, we urge you to contact local emergency services or a professional mental health service. Additional support can also be found in the 'I Need Help' sections of our platform.

 

Ensuring online safety is important not only to comply with regulations but also to protect our community's wellbeing. The UK Online Safety Act (2023) requires platforms to moderate content that may be harmful, including content that encourages, promotes, or provides instructions for suicide, self-harm, and eating disorders. By adhering to these guidelines, we prevent the normalisation or glamorisation of harmful behaviours and create a safer online space. If you post content that falls into these areas, you will be signposted to helplines and resources, and the content will be removed from the platform. Our commitment to best practices and regulatory compliance underscores our dedication to the safety and wellbeing of our community members.

 

 

Why These Guidelines Matter:

 

Engaging in discussions around sensitive topics like self-harm, suicidal thoughts, and eating disorders can be helpful for some users. However, some content around these topics may be harmful for other users, especially for those who are vulnerable or currently struggling. By moderating these conversations, we strive to strike a balance between allowing users to express themselves and protecting vulnerable users from potentially harmful content, and to mitigate the risk of real-world harm.

 

Our Goals:

 

  • Promote Help Seeking: Encourage messages that guide individuals towards seeking further peer community support or, in some cases, professional help.

  • Share Stories of Hope: Foster an environment where stories of recovery, support, and resilience are shared.

  • Offer Self-Care Tips: Provide and receive practical advice on self-care and wellbeing.

  • Provide Support Information: Share reliable information about sources of support and useful resources.

  • Encourage Safe Sharing: Promote sharing feelings, emotions, and coping strategies in a supportive and responsible manner that benefits everyone.

 

By following these guidelines, we can create a supportive and safe community for everyone.

Evidence And Best Practice In Supporting The Protection Of Vulnerable Users:

While research is limited, some evidence suggests that young people may be at increased risk from viewing self-harm and suicide content online. Research conducted by Samaritans and the University of Bristol found that 26% of young people who had presented to hospital for self-harm or a suicide attempt had used the internet in relation to this.

Whilst some self-harm and suicide-related content can be helpful for users and a part of their recovery, other users may stumble upon content by accident or engage with it for more harmful reasons, such as finding information about methods of harm.

Similarly, content about eating disorders can have a significant impact on those suffering from or vulnerable to eating disorders. People with eating disorders often feel that social media is exacerbating or 'fuelling the fire' of their illness.

We want users to be able to speak about difficult feelings that can be challenging to discuss face-to-face. However, content about topics such as suicide, self-harm and eating disorders carries a risk to vulnerable users, and we look to research and industry guidelines to help us protect them.

Current research and guidelines for best practice can be found here:

World Health Organization. (2023). "Updated Recommendations for Online Suicide Prevention."

Samaritans Media Guidelines

Beat Eating Disorder Media Guidelines

 

Compliance with International Legislation including the UK Online Safety Act (2023):

 

Our policies align with the UK Online Safety Act (2023), which requires online platforms to remove illegal content and protect vulnerable users from potentially harmful content. This includes addressing risks related to content that could cause physical or psychological harm, such as self-harm and eating disorders. By following these guidelines, we not only meet legal standards but also demonstrate our commitment to the safety and wellbeing of our community.

 

What This Means for Us:

  • Content Moderation: We actively monitor and moderate content to remove or restrict harmful material.

  • Support Resources: We provide links and information to professional support services for those in need.

  • Community Guidelines: We have clear community guidelines to ensure a positive and supportive environment.

 

What This Means for You:

  • Report Harmful Content: Help us by reporting any content you find harmful or concerning.

  • Engage Positively: Share your experiences and support others in a constructive and safe manner in line with our ‘Terms of Service’ and ‘Community Guidelines’.
     

As a global company, it is essential to comply with other international laws such as the Australian Online Safety Act and the EU Digital Services Act. These regulations demand a high standard of compliance and an increased duty of care on companies such as TalkLife to address potentially harmful content, including content related to self-harm, suicide and eating disorders.

 

By aligning our policies with these global standards, we strengthen our commitment to a safe online environment and ensure broader legal compliance. Furthermore, adopting best practices such as proactively monitoring content, educating users on safe online behaviours, and partnering with mental health organisations enhances our ability to protect our users. This comprehensive approach not only meets regulatory requirements but also fosters a supportive and safe online peer community.

 

How TalkLife Works to Proactively Protect Users and Promote Positive and Safe Peer-to-Peer Mental Health Support

 

At TalkLife, we prioritise the safety and wellbeing of our community by managing content related to self-harm, eating disorders and suicide through the following measures:

 

1. 24/7 Professional Admin Team:

Our Trust & Safety Team are professionally trained online moderators dedicated to ensuring the safety and security of our community. They operate continuously, 24/7, removing content that violates our 'Terms of Service' and 'Community Guidelines' or is deemed illegal or inappropriate.

 

2. Safety Screens and Supportive Messaging:

Before posting in self-harm and eating disorder categories, users are shown safety screens that include:

  • Supportive messages.

  • Proactive signposting and encouragement to reach out for support.

  • Links to helplines and resources.

  • Suggestions for safe ways to express their feelings.

 

3. Content Removal and Support:

If content is removed by the Admin Team for breaching community guidelines regarding self-harm, suicidal actions, or eating disorder behaviours, users will receive:

  • A supportive message explaining the removal.

  • Encouragement to seek support.

  • Links to helplines and resources.

  • The option to message the admin team for further clarification on the removal.

 

4. User Reporting:

Users can flag posts and profiles for review by the Admin Team if they have concerns about the content or the wellbeing of another user.

 

5. Supportive Resources:

We provide resources written by mental health experts on eating disorders, self-harm and suicide, accessible through our safety screen prompts, via links within our community guidelines, and at any time in the Wellbeing Centre in the sidebar of our platforms.

 

6. Helpline and Emergency Numbers:

Helpline and emergency numbers are readily available in the sidebar for immediate assistance.

 

7. Content Recommender Systems:

We do not use content recommender systems that could push potentially harmful content to users.
 

We aim to create a supportive, safe, and positive environment for all TalkLife users. Your participation in reporting concerns and engaging positively helps us maintain this community standard.

Trigger Warnings - Purpose And Limitations:

 

Purpose of Trigger Warnings:

Trigger warnings are designed to alert users to content that may evoke a strong emotional response or discomfort due to past trauma or sensitive experiences. This allows individuals to prepare mentally or choose to avoid certain topics that might be distressing. They are especially crucial in discussions about self-harm, suicide, and eating disorders, where users may be particularly vulnerable.

 

What Trigger Warnings are not for:

While trigger warnings provide a prelude to sensitive content, they do not permit users to circumvent community guidelines. These warnings are not a safeguard that allows the detailing of prohibited methods, explicit descriptions, or any content that explicitly violates our guidelines.

 

Content Moderation with Trigger Warnings:

Trigger warnings do not exempt posts from moderation. If a post, even with a trigger warning, contains content that violates our community guidelines or encourages harmful behaviours, it will be removed. This approach is in line with our commitment to maintaining a safe online environment for all users, in compliance with the UK Online Safety Act (2023) and other applicable legislation.

 

Ensuring a Safe Community:

Our priority is to create a supportive and secure space where all members can engage without exposure to harmful content. The use of trigger warnings is part of our broader strategy to manage sensitive discussions responsibly. However, adherence to our community guidelines is paramount, and all content, regardless of warnings, is subject to review and moderation to prevent harm and ensure the well-being of our community members.
 


 

Policy Details:

 

 

1. Self-Harm/Self-injury

 

At TalkLife, we understand that some users who are struggling to cope seek support around self-harm. Ensuring that discussions around self-harm and self-harming behaviours are supportive and not detrimental is crucial to fostering a healthy environment for recovery and support. It is also important that discussions around self-harm are conducted safely and in a way that does not trigger others in the community. Self-harm in this context refers to intentional injury to oneself, often as a way to cope with emotional distress. This could include, but is not limited to, behaviours such as cutting, burning or hitting oneself.

 

Trigger Warnings:

 

  • Users are encouraged to use a trigger warning when posting content related to self-harm. This can be done by selecting the "this post could be triggering" option under post settings. This helps others prepare themselves emotionally before engaging with the content or to avoid it altogether.

  • If a trigger warning is not added when the post is created, our Safety Team may apply this tag later.

  • Regardless of a trigger warning, content that breaches our 'Community Guidelines' will be subject to removal to maintain a safe and supportive environment for all community members. This also applies to content shared with mutuals.

 

What Users Can Post:

 

1. General Experiences:

Sharing personal stories, past or present, related to struggles with self-harm, focusing on emotional experiences without detailing methods.

2. Coping Strategies:

Discussing general coping mechanisms that do not involve self-harm, such as mindfulness, exercise or talking to friends or therapists.

 

3. Seeking Support:

Sharing struggles with self-harm and seeking support in a positive way from the community, without sharing explicit details around methods.

 

What Users Cannot Post:

 

1. Graphic Descriptions, Details of Methods or Instructions for Self-Harm:

  • Posting specific methods or techniques of self-harm, as detailed descriptions can act as a how-to guide and encourage self-harm by normalising the behaviour.

  • Including any distressing details about the aftermath of a self-harm episode or the appearance of any injuries.

  • Sharing how to hide self-harm equipment or conceal damage caused to the body may help users hide harm from the people around them and could delay someone getting support; such content will be removed.

 

2. Glamorising, Promoting, or Encouraging Self-Harm:

  • Posts that make self-harm appear appealing or beneficial, as they can influence others to view self-harm as a viable or attractive option.

  • Encouraging other users to self-harm is clearly harmful and prohibited.

 

3. Intent or Planning:

  • Posts that state a user is about to self-harm or is imminently planning to do so, as they may be harmful to vulnerable users.

4. Acronyms, abbreviations or ‘algospeak’:

  • Code words or turns of phrase used to reference self-harm and evade moderation will be removed.

 

Why do we have these guidelines? 

 

  • We aim to foster a safe and supportive community where members can seek and offer help without exposing themselves or others to harmful content. 

  • Detailed descriptions and glamorisation of self-harm can lead to normalisation or imitation, particularly by vulnerable individuals.

  • Removing such content supports a safe environment and aligns with best practices for online safety, reducing the risk of harm among community members. 

  • As a global company, it is essential to comply with international laws such as the UK Online Safety Act (2023), the Australian Online Safety Act and the EU Digital Services Act. These regulations demand a high standard of compliance and an increased duty of care on companies such as TalkLife to address potentially harmful content. This comprehensive approach not only meets regulatory requirements but also fosters a supportive and safe online peer community.

 

Additional Considerations:

 

  • User Reporting
    Users are encouraged to report posts that do not adhere to these guidelines for review by the Safety Team.

  • Supportive Messaging
    We want to ensure that everyone in the community feels supported and safe; therefore, if a user's content is removed, they will receive a supportive message with encouragement to seek help and links to dedicated resources.

  • Resources and Helplines
    Users have a number of targeted resources and helplines for immediate support that can be accessed in the 'Wellbeing Centre' and sidebar of the platform.

 

 

 

2. Suicide


Conversations about suicide and suicidal thoughts must be managed with utmost care to ensure they do not inadvertently promote or trigger harmful actions. Suicide refers to the intentional act of causing one's own death, often involving planning and a sense of imminence. It is critical that any discussion around this sensitive topic is handled responsibly to protect all members of our community. While we understand that there may not be an intent to harm when sharing experiences related to suicide, including remarks made in a lighthearted or offhand manner, we will remove any content that falls under this category if it violates our Community Guidelines. This precautionary approach ensures that all discussions, regardless of intent, are handled with the utmost care to prevent any potential harm or triggering of other community members.

Importance of Managing Suicide Discussions:

 

  • Discussing suicide and suicidal thoughts openly can be a vital part of recovery and support. However, without careful management, such discussions can unintentionally encourage harmful behaviour or trigger vulnerable individuals, which could include the user themself. Therefore, it is essential to create a safe environment where users can seek help and share their experiences without risking further harm.

  • Peer support is invaluable but it is not a substitute for professional crisis intervention. We take our duty of care towards our community members seriously and strive to ensure that everyone has access to timely information and seamless signposting to appropriate support options.

  • We understand that for some, thoughts of suicide can be part of experiencing certain mental health difficulties. However, when these thoughts start to turn into plans of how to end one’s life and/or actions, professional help and support becomes crucial. Our aim is to ensure that all members of our community are signposted to crisis support helplines and other resources to navigate these crises effectively.

 

Trigger Warnings:

 

  • Users are encouraged to use a trigger warning when posting content related to suicide or suicidal thoughts. This can be done by selecting the "this post could be triggering" option under post settings. This helps others prepare themselves emotionally before engaging with the content or to avoid it altogether.

  • If a trigger warning is not added when the post is created, our Trust and Safety Team may apply this tag later.

  • Regardless of a trigger warning, content that breaches our 'Community Guidelines' will be subject to removal to maintain a safe and supportive environment for all community members. This also applies to content shared with mutuals.

 

What Users Can Post:

 

1. General Experiences of emotional and/or mental health difficulties  

Sharing personal stories and feelings related to struggles with mental health and suicidal feelings, focusing on emotional experiences and the ways in which you might find support from the community helpful. If we have concerns that an individual has intentions to harm themselves, we will remove the content.

 

2. Stories of Hope, Support, and Recovery

Sharing personal stories of overcoming suicidal thoughts and experiences, including feeling suicidal in the past, focusing on recovery and resilience.

3. Promoting Help-Seeking

Messages that encourage seeking help from mental health professionals, crisis helplines, or support groups.

4. Self-Care Tips

Tips on self-care and maintaining well-being, such as mindfulness practices, physical exercise, or connecting with loved ones.

 

5. Support Resources:

Information about available sources of support, including hotlines, counselling services, and online resources.

 

6. Safe Coping Strategies:

Sharing feelings and coping strategies in a safe and supportive manner, ensuring that the content is beneficial for all members.

 

What Users Cannot Post:

 

1. Suicidal Intent

Any post that suggests a user cannot keep themselves safe. This includes explicit intentions or plans to die, references to lethal means, or any details of methods. This policy ensures that individuals displaying signs of acute distress engage with crisis helplines, as peer support alone may not be sufficient.

 

2. Promotion, Encouragement, or Providing Instructions for Suicide

Any content that suggests suicide as a solution, romanticises the act of suicide, encourages users to end their lives, or provides instructions for suicide is strictly prohibited.

 

3. Acronyms, abbreviations or ‘algospeak’:

Code words or turns of phrase used to evade moderation will be removed.

 

Why do we have these guidelines? 

 

  • The platform is not a substitute for professional crisis services. 

  • Proactively removing posts that indicate a user is considering suicide is vital to ensure the user is directed to appropriate support. This action can help prevent possible harm or triggering of other vulnerable users. 

  • We aim to foster a safe and supportive community where members can seek and offer help without exposing themselves or others to harmful content. 

  • Removing such content supports a safe environment and aligns with best practices for online safety, reducing the risk of harm among community members. 

  • As a global company, it is essential to comply with international laws such as the UK Online Safety Act (2023), the Australian Online Safety Act and the EU Digital Services Act. These regulations demand a high standard of compliance and an increased duty of care on companies such as TalkLife to address potentially harmful content. This comprehensive approach not only meets regulatory requirements but also fosters a supportive and safe online peer community.


 

Additional Considerations:

 

  • User Reporting
    Users are encouraged to report posts that do not adhere to these guidelines for review by the Safety Team.

  • Supportive Messaging
    We want to ensure that everyone in the community feels supported and safe; therefore, if a user's content is removed, they will receive a supportive message with encouragement to seek help and links to dedicated resources.

  • Resources and Helplines
    Users have a number of targeted resources and helplines for immediate support that can be accessed in the Wellbeing Centre and sidebar of the platform.

  • Crisis Intervention

In cases where a user indicates an imminent risk of suicide, the Trust & Safety Team will provide urgent links to crisis services, ensuring the user is aware of immediate help options.


3. Eating Disorders

 

Ensuring that discussions around eating disorders are supportive and not detrimental is crucial to fostering a healthy environment for recovery and support. Eating disorders, such as anorexia nervosa, bulimia nervosa, and binge-eating disorder, are serious mental health conditions characterised by severe disturbances in eating behaviour and body image. Routine monitoring of health by a doctor or health professional is vital due to the associated physical and psychological risks, including malnutrition, organ damage, and increased mortality rates.

 

In the following information, 'disordered eating behaviours' refers to unhealthy eating patterns that may not meet the clinical criteria for an eating disorder but still pose health risks.

 

Importance of Managing Eating Disorder Discussions:

 

  • Managing conversations about disordered eating behaviours and eating disorders is essential to provide a safe space for individuals seeking support and recovery. It is important to ensure these discussions do not inadvertently encourage or normalise disordered eating behaviours.

  • The role of social media and societal pressures in promoting restrictive eating and unrealistic body standards can exacerbate these issues. We aim to counteract these influences by promoting healthy and supportive dialogue.

  • If we find or are alerted to content related to disordered eating behaviours or eating disorders, we will attempt to signpost and encourage users to access professional services for support.

 

Trigger Warnings:

 

  • Users are encouraged to use a trigger warning when posting content related to eating disorders. This can be done by selecting the "this post could be triggering" option under post settings. This helps others prepare themselves emotionally before engaging with the content or to avoid it altogether.

  • If a trigger warning is not added when the post is created, our Safety Team may apply this tag later.

  • Regardless of a trigger warning, content that breaches our 'Community Guidelines' will be subject to removal to maintain a safe and supportive environment for all community members. This also applies to content shared with mutuals.

 

What Users Can Post:

 

1. Supportive Recovery Stories

Sharing personal recovery stories from eating disorders that focus on the emotional journey and support received.

 

2. Healthy Eating Practices

Discussing general healthy eating and lifestyle changes that promote overall well-being, emphasising balanced nutrition and self-care.

 

3. Coping Strategies

Sharing healthy coping strategies that have helped in managing triggers and stress, such as mindfulness techniques, journaling, and physical activities that promote well-being.

 

4. Positive Body Image

Promoting positive body image and self-acceptance, sharing experiences of overcoming societal pressures, and discussing ways to build and maintain a healthy self-image.

 

5. Seeking and Offering Support

Encouraging others to seek professional help and sharing information on how to access local and online support services. Offering emotional support and encouragement to fellow community members who are struggling.

 

6. Educational Content

Providing factual information about eating disorders, including symptoms, risks, and the importance of seeking medical advice. Sharing insights from reputable sources to help educate the community about the dangers of disordered eating and the benefits of recovery.


What Users Cannot Post:

 

1. Specific Dieting Measures and Weight Details

  • Posts that include specific numbers, such as dieting measures, calorie counts, weight, or body measurements. This information can be triggering to others who are susceptible to or recovering from eating disorders. Avoiding these details helps prevent the perpetuation of unhealthy comparisons and obsessive behaviours.
     

2. Encouraging, Promoting, or Providing Instruction for Disordered Eating Behaviours

  • Any content that glamorises, normalises, or encourages eating disorders will be removed. This includes:

    • Promoting extreme dieting behaviours or practices.

    • Idealising certain body types that promote unrealistic and harmful standards.

    • Providing step-by-step instructions or tips on how to engage in disordered eating behaviours.

    • Sharing "success stories" related to extreme weight loss or unhealthy eating habits.
       

3. Detailed Accounts of Disordered Eating Behaviours

  • Describing disordered eating behaviours in detail is strictly prohibited. This includes:

    • Explicit descriptions of methods used to restrict food intake, purge, or binge.

    • Detailed accounts of personal experiences with disordered eating that go beyond general emotional experiences and into the specifics of behaviours.

    • Any content that might serve as a guide or endorsement of unhealthy eating practices.
       

Why do we have these guidelines? 

 

  • Discussions involving specific dieting techniques, calorie counting, or detailed body measurements can trigger harmful behaviours in individuals with or recovering from eating disorders. By moderating this content, we prevent the promotion or normalisation of eating disorders and difficulties such as disordered eating.  

  • Removing such content supports a safe environment and aligns with best practices for online safety, reducing the risk of harm among community members. 

  • We aim to foster a safe and supportive community where members can seek and offer help without exposing themselves or others to harmful content. Promoting safe and supportive discussions helps counteract societal pressures and the harmful influence of social media, contributing to a healthier community environment.

  • As a global company, it is essential to comply with international laws such as the UK Online Safety Act (2023), the Australian Online Safety Act and the EU Digital Services Act. These regulations demand a high standard of compliance and an increased duty of care on companies such as TalkLife to address potentially harmful content. This comprehensive approach not only meets regulatory requirements but also fosters a supportive and safe online peer community.

 

Additional Considerations:

 

  • User Flagging
    Users are encouraged to report posts that do not adhere to these guidelines for review by the Safety Team.

  • Supportive Messaging
    We want to ensure that everyone in the community feels supported and safe; therefore, if a user's content is removed, they will receive a supportive message with encouragement to seek help and links to dedicated resources.

  • Resources and Helplines
    Users have a number of targeted resources and helplines for immediate support that can be accessed in the Wellbeing Centre and sidebar of the platform.

  • Routine Monitoring
    We encourage users experiencing difficulties with eating to have regular health check-ups with a general practitioner to monitor their physical and mental health, especially those with or at risk of eating disorders.

Reporting and Response:

We recognise that users may be facing personal struggles and need a platform to express themselves, particularly on sensitive issues like self-harm, suicide, and eating disorders. Therefore, it is important to clarify that users will not be banned or suspended simply for posting content around these topics. However, on rare occasions, a suspension may be issued if it is clear that a user persistently posts content against our guidelines, showing an awareness of these breaches. Our primary goal is to maintain a safe and supportive environment. If content is removed because it does not adhere to our community guidelines aimed at preventing harm, the affected users will receive a message explaining the reason for the removal.

 


 

At TalkLife, safeguarding our community members is paramount. We are dedicated to fostering an environment where every user feels supported and respected. As part of our commitment to your wellbeing, we adhere to strict guidelines and policies to ensure the safety of all users.

 

While TalkLife serves as a platform for peer support and understanding, it is important to recognise that it is not a replacement for professional mental health services. We encourage individuals to seek professional help when needed and to utilise the resources available in our 'I Need Help' sections for immediate support.

 

In compliance with the UK Online Safety Act (2023), international good practice guidelines and our internal policies, we actively moderate content to ensure it aligns with our community guidelines and does not pose harm to others. Your participation in reporting violations of these guidelines is vital. Our platform has a rigorous review process for each report, ensuring appropriate actions are taken, such as content removal or account modification, when necessary.

 

Additionally, we encourage community members to flag users or content that raises concerns about the safety of others. This option can be found under the 'Suicide, Self-Harm & Unhealthy Behaviours' category. Rest assured that your reports are confidential, and users are not informed that concerns have been raised about them.


Together, we can uphold the values of empathy, compassion, and respect that define TalkLife. By working collaboratively, we can continue to create a safe and supportive space where individuals can find solace, share experiences, and receive the support they deserve.

Thank you for being a valued member of our community.
