/r/metaresearch


For those interested in research on research, evidence synthesis (systematic reviews, meta-analyses), improving the quality of scientific research, and evidence-based anything.

Description

A community for anyone working in or interested in the appraisal of research findings, sources of bias, and the totality of evidence within medicine and all other disciplines. A place where we can come together, share information we believe will be valuable to others, and discuss it. The community is young and not much of a resource yet, but it could quickly become one.

 

To get you started

1. Introduction. An excellent introduction to Reddit and how to use it can be found here.

2. Reddit-speak. subreddit = sub = group (e.g. this subreddit is called metaresearch), mods = moderators, flairs = tags (e.g. in this subreddit we use: Research, Discussion, etc.).

3. Reditr. You may eventually want to use this to access Reddit on your desktop for a cleaner interface.

 

Rules

1. Abide by reddiquette. Posts that do not will be removed.

2. No spam or self-promotion. Any posts that the mods feel are primarily geared towards personal rather than community benefit will be removed.

 

Best practice guidelines

1. Use informative titles. Please do not use vague titles - make sure we know what we are getting into before clicking on a post. If posting a link to an article, try to include journal name abbreviation and date in parentheses at the end, e.g. "My post title (BMJ, 2018)". If posting a meeting, try to include dates and place in parentheses at the end, e.g. "My meeting post (21-22 May, 2018; Seattle)". Same logic applies to other types of posts.

2. Link to original article. If posting a news article referring to a published study, please also provide the link to the published study as a comment.

3. At least one comment for links. If posting a link, add a comment of at least one sentence about it. This helps others understand what the link is about and starts a conversation. To add a comment, first post your link and then click "comment" on the main page of the subreddit.

4. Anything metaresearch-related is of interest. You can post any of the following: (1) a link to a paper, news article, startup, interesting opportunity, or other resource; (2) an opinion; (3) a conversation starter; (4) something else.

5. If you like, upvote. If you really like a post, upvote it!

6. If you don't like, downvote. Help mods remove inappropriate or irrelevant posts by downvoting.

7. If you have an idea, let the mods know. Do you have feedback about this subreddit? Do you have any ideas? Hit the "message the moderators" button to let the mods know.

8. Use flairs. You can give your post an appropriate tag by clicking "flair" underneath it. If you'd really like a tag you cannot see, let the mods know. To search by a specific flair, type, for example: flair:"Research"

 

We need you

Do you have ideas for improving this sub? Are you an expert in CSS? Are you skilled with graphic design software? If the answer to any of the above is "Yes" and you'd like to contribute, please contact the mods ;)

 

Other subreddits you might like

/r/openscience

/r/PhilosophyofScience

/r/BiologyPreprints

/r/BadPharma

/r/epidemiology

/r/medicine

/r/science

/r/metaresearch

614 Subscribers

1

“Be sustainable”: recommendations for implementation of #FAIR principles in life science data handling

0 Comments
2023/11/20
15:38 UTC

1

Where can I find a list of all the open-source websites, like https://www.biorxiv.org/, for medical research papers and preprints?

0 Comments
2023/03/24
10:28 UTC

0

Unlock an article for me pleasee!!!

0 Comments
2023/02/24
13:16 UTC

2

Registered Reports Community Feedback - seeking feedback from authors and reviewers

Hi all,

We've recently launched Registered Reports Community Feedback - a site to better understand authors' and reviewers' experience of the Registered Reports peer review process:

https://registeredreports.cardiff.ac.uk

The broad goal is to collect data regarding how well various aspects of the Registered Reports process are implemented across academic journals.

This data will be aggregated and displayed publicly, showing how journals were rated across a range of categories by authors and reviewers.

We hope this will:

  1. Help the community in choosing where to submit their Registered Report manuscripts
  2. Incentivise publishers to improve the Registered Reports process at their journals

We want your feedback!

If you've been an author or reviewer of a Registered Report manuscript at Stage 1 and/or Stage 2, you can find our survey here (takes 5-10mins):

https://registeredreports.cardiff.ac.uk/feedback/feedback/selector.php

Don't forget to invite your co-authors - their feedback is important too!

[Screenshot of the dashboard, showing aggregate ratings by authors and reviewers of their experience of the Registered Reports peer review process]

So far, while the site has been in testing, users have given over 150 pieces of feedback across 34 journals.

You can view our dashboards, where journals are ranked by ratings, along with more detailed summaries for each journal:

https://registeredreports.cardiff.ac.uk/feedback/dashboards/

Along with contributing to community knowledge via the site, data will be used as part of my PhD on metascience.

Both the summary data and source code of the site will be released under open licences.

The team behind the site are: myself, Chris Chambers, and Loukia Tzavella (all at Cardiff University), with funding provided by Arnold Ventures.

Many thanks to all the beta-testers whose time and ideas have helped improve the site!

Any questions or comments, contact details here:

https://registeredreports.cardiff.ac.uk/feedback/contact/

Thanks!

0 Comments
2023/01/21
15:37 UTC

1

The Nature and Nurturing of Research: A Modern Synthesis

0 Comments
2022/08/25
18:23 UTC

0

The Nobel prize for science

If you want to win the Nobel prize for science, then just do this.

Magister Colin Leslie Dean has destroyed your biology with one sentence:

You accept species.

You accept species hybridization.

Thus species hybridization contradicts the notion of species, making evolution, i.e. evolving species, nonsense.

Thus, if you want to win the Nobel prize for science, be an Einstein and put the anomalies (hybridizations) into a new paradigm.

A paradigm shift is required to take account of the fact that species and evolution are in fact nonsense.

So what is a species?

Scientific reality is textual

http://gamahucherpress.yellowgum.com/wp-content/uploads/Scientific-reality-is-textual.pdf

or

https://www.scribd.com/document/572639157/Scientific-Reality-is-Textual

Just a definition:

https://www.nationalgeographic.org/encyclopedia/species/

"A species is often defined as a group of organisms that can reproduce naturally with one another and create fertile offspring"

But species hybridization contradicts that:

https://kids.frontiersin.org/articles/10.3389/frym.2019.00113

"When organisms from two different species mix, or breed together, it is known as hybridization"

"Fertile hybrids create a very complex problem in science, because this breaks a rule from the Biological Species Concept"

So the definition of species is nonsense.

Note: when biologists can't tell us what a species is without contradiction, evolution theory, i.e. evolving species, is nonsense.

Evolution is a myth.

2 Comments
2022/08/13
07:02 UTC

3

Elicit.org AI-based Metasearch Engine finds scientific articles and related research

1 Comment
2022/08/06
20:48 UTC

5

System - a platform for creating meta-analyses through research aggregation and for promoting systems thinking

Hi all!
For the past few years, a small team of us here at System has been working to build a platform to organize the world’s data and knowledge in a whole new way. We just launched our public beta, and we’d love for you to check it out at System.com.

Our commitment to open data and open science is explicitly codified in our Public Benefit Charter. Like Wikipedia, the information on System is available under a Creative Commons Attribution-ShareAlike license, and topic definitions on System are sourced from Wikidata.

V1.0-beta of System is read-only, but soon, anyone will be able to contribute evidence of relationships. To become an early contributor of data or research to System (whether it’s research you’ve authored yourself, or published research that exists elsewhere), or just to be part of our growing community of systems thinkers, please come join us on Slack.

0 Comments
2022/06/17
16:44 UTC

1

A scientific prediction project

A few days ago, I discussed a project that I've been developing for assessing scientific predictive power. I've written a much more detailed explanation of the ideas behind it, and today I uploaded it to the arXiv physics preprint server here:

Assessing scientific predictive power

4 Comments
2022/05/11
01:08 UTC

2

My scientific prediction project

2 Comments
2022/05/06
17:43 UTC

1

Publishing with JBI Evidence Synthesis

I'm currently working on a scoping review protocol and wondering if anyone has experience publishing with JBI Evidence Synthesis. Do they require authors to complete their training, or is it optional? I haven't found anything in the author guidelines, but I've heard informally that they do require it. TIA

3 Comments
2022/04/11
16:45 UTC

1

Question about best practice when pre-registering analysis of existing data

0 Comments
2022/04/04
14:29 UTC

2

I started a sub on the reproducibility crisis in science. Please check it out. Thanks. r/reproducibilitycrisis (link is posted inside).

r/reproducibilitycrisis

0 Comments
2021/12/02
22:50 UTC

3

PubMed: changing # of records even when publication date restricted?

Hi all,

I'm running my first methodological survey of sorts, and it's based on a couple of simple searches in PubMed that included a time restriction for the past five years, up to July 31, 2021.

I ran both searches on August 2 and got 1,125 and 131 hits, respectively. Out of curiosity (and maybe some anxiety as this is my first time) I ran it again recently and got 1,122 and 132 instead.

I'm less concerned about the 1,122, because I'd rather have the bigger number that my team and I are currently looking through. But I was able to identify that one "missing record" of the 132: it has a creation date of April 3 and was published in the July issue of a journal.

Are there any more experienced searchers out there who could tell me what's happening here? Is this a common experience? More importantly, should I go ahead and add that one reference, or just accept that it didn't come up when the search was originally run?
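
(One way to keep reruns comparable, for anyone hitting the same drift: PubMed keeps adding, merging and re-dating records after a search is run, so a date-restricted query can return different counts later. Pinning the query to a fixed Entrez-date window via the E-utilities and archiving the returned PMIDs makes the original result set reproducible. The sketch below uses Biopython's Entrez module; the query string, email address and date range are placeholders, not the survey's actual search.)

    # Minimal sketch, assuming Biopython is installed; the query, email and
    # dates below are placeholders rather than the real search.
    from Bio import Entrez

    Entrez.email = "your.name@example.org"  # NCBI asks for a contact address

    QUERY = "meta-research[tiab]"  # placeholder search string

    # Restrict on the Entrez creation date (EDAT) instead of the publication
    # date: records that PubMed indexes after the original run then fall
    # outside the window instead of changing later counts.
    handle = Entrez.esearch(
        db="pubmed",
        term=QUERY,
        datetype="edat",
        mindate="2016/08/01",
        maxdate="2021/07/31",
        retmax=10000,  # ESearch returns at most 10,000 IDs per call
    )
    result = Entrez.read(handle)
    handle.close()

    print("Hits:", result["Count"])

    # Archive the exact PMIDs so the screened set is documented even if
    # PubMed's counts move later.
    with open("pmids_2021-08-02.txt", "w") as f:
        f.write("\n".join(result["IdList"]))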

2 Comments
2021/09/02
03:03 UTC

2

User Guide for OJS 3.x

0 Comments
2021/08/31
19:58 UTC

7

What would it take to build a platform for discovering new cross-disciplinary insights, rapidly innovating, and fostering collective intelligence? Joel Chan weighs in.

0 Comments
2021/08/27
15:45 UTC
