/r/metaresearch
For those interested in research on research, evidence synthesis (systematic reviews, meta-analyses), improving the quality of scientific research, and evidence-based anything.
Description
A mutually beneficial community for anyone working in or interested in the appraisal of research findings, sources of bias, and the totality of evidence, within medicine and all other disciplines. A place where we can all come together, share information we believe would be valuable to others, and discuss it. The community is young and thus not much of a resource at the moment, but it could quickly become one.
To get you started
1. Introduction. An excellent introduction to Reddit and how to use it here.
2. Reddit-speak. subreddit = sub = group (e.g. this subreddit is called metaresearch), mods = moderators, flairs = tags (e.g. in this subreddit we use: Research, Discussion, etc.).
3. Reditr. In the future you may want to use this to access Reddit on your desktop for a cleaner interface.
Rules
1. Abide by reddiquette. Any posts that do not will be removed.
2. No spam or self-promotion. Any posts that the mods feel are primarily geared towards personal rather than community benefit will be removed.
Best practice guidelines
1. Use informative titles. Please do not use vague titles - make sure we know what we are getting into before clicking on a post. If posting a link to an article, try to include journal name abbreviation and date in parentheses at the end, e.g. "My post title (BMJ, 2018)". If posting a meeting, try to include dates and place in parentheses at the end, e.g. "My meeting post (21-22 May, 2018; Seattle)". Same logic applies to other types of posts.
2. Link to original article. If posting a news article referring to a published study, please also provide the link to the published study as a comment.
3. At least one comment for links. If posting a link, add a comment of at least one sentence about it. This helps others appreciate what the link is about and starts a conversation. To add a comment, first post your link, then click on "comment" on the main page of the subreddit.
4. Anything metaresearch is of interest. You can post any of: (1) links to: a paper, news article, startup, interesting opportunity, other; (2) an opinion; (3) a conversation starter; (4) other.
5. If you like, upvote. If you really like a post, upvote it!
6. If you don't like, downvote. Help mods remove inappropriate or irrelevant posts by downvoting.
7. If you have an idea, let the mods know. Do you have feedback about this subreddit? Do you have any ideas? Hit "message the moderators" to let the mods know.
8. Use flairs. You can give an appropriate tag to your post by clicking on "flair" underneath your post. If you'd really like a tag you cannot see, let the mods know. To search by a specific flair, type, for example: flair:"Research"
We need you
Do you have ideas about improving this sub? Are you an expert in CSS? Are you skilled with graphic design software? If the answer to any of the above is "Yes" and you'd like to contribute, please contact the mods ;)
Hi there,
I hope it's okay to ask about this in this subreddit.
I am currently working on a systematic review and will need to conduct a meta-analysis as well. I would like to know how to incorporate studies that have reported different statistics. For example, one study might have reported a mean and SD, while another might have reported only a t-value, or LCL and UCL (lower and upper confidence limits).
I am aware that this can be done using the software "Comprehensive Meta-Analysis," but I am having trouble finding out how to do it in R (RStudio). When I searched, the results were about conducting a meta-analysis with articles that measured the Y-variable differently, rather than ones that simply reported different statistics.
I really appreciate any help you can provide.
Best regards.
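One common approach is to convert whatever each study reports into the same effect size (e.g., a standardized mean difference) plus its sampling variance, and then pool everything with the metafor package in R. Below is a rough sketch, not a definitive workflow: all the numbers are made-up placeholders, and the conversions are the standard textbook formulas (d from a t-value; SE from the width of a 95% CI). Note that an estimate recovered from a CI can only be pooled this way if it is already on the same effect-size scale as the others.

    library(metafor)

    ## Study 1 reported means and SDs: escalc() converts these directly
    ## (for measure = "SMD" it returns a bias-corrected d, i.e., Hedges' g).
    s1 <- escalc(measure = "SMD",
                 m1i = 5.2, sd1i = 1.1, n1i = 40,
                 m2i = 4.6, sd2i = 1.3, n2i = 38)

    ## Study 2 reported only a t-value and group sizes:
    ## d = t * sqrt(1/n1 + 1/n2), with the usual sampling variance for d.
    t2 <- 2.31; n1 <- 30; n2 <- 32
    d2 <- t2 * sqrt(1/n1 + 1/n2)
    v2 <- (n1 + n2) / (n1 * n2) + d2^2 / (2 * (n1 + n2))

    ## Study 3 reported an estimate with 95% confidence limits (LCL, UCL),
    ## assumed here to already be on the SMD scale:
    ## SE = (UCL - LCL) / (2 * 1.96), variance = SE^2.
    est <- 0.48; lcl <- 0.10; ucl <- 0.86
    v3  <- ((ucl - lcl) / (2 * 1.96))^2

    ## Pool all three with a random-effects model.
    dat <- data.frame(yi = c(s1$yi, d2, est),
                      vi = c(s1$vi, v2, v3))
    rma(yi, vi, data = dat, method = "REML")

The esc package automates many of these conversions as well, and the metafor documentation lists which measures escalc() can compute; the key point is that everything has to end up on one effect-size scale before pooling.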
I’ve heard a lot of discussion about retractions in connection with the replication/reproducibility crisis. A retraction almost always has something to do with the handling of data. Therefore, it almost always involves experimental work. Is there any situation where it would make sense to retract a theory paper? Is there any precedent for that? I am thinking of, for example, a situation where a mathematical derivation was found to be concretely incorrect or simply made up, or something along those lines.
Hi all,
We've recently launched Registered Reports Community Feedback - a site to better understand authors' and reviewers' experience of the Registered Reports peer review process:
https://registeredreports.cardiff.ac.uk
The broad goal is to collect data regarding how well various aspects of the Registered Reports process are implemented across academic journals.
This data will be aggregated and displayed publicly, showing how journals were rated across a range of categories by authors and reviewers.
We hope this will:
We want your feedback!
If you've been an author or reviewer of a Registered Report manuscript at Stage 1 and/or Stage 2, you can find our survey here (takes 5-10 minutes):
https://registeredreports.cardiff.ac.uk/feedback/feedback/selector.php
Don't forget to invite your co-authors - their feedback is important too!
So far, while the site has been in testing, users have given over 150 pieces of feedback across 34 journals.
You can view our dashboards, where journals are ranked by ratings, along with more detailed summaries for each journal:
https://registeredreports.cardiff.ac.uk/feedback/dashboards/
Along with contributing to community knowledge via the site, data will be used as part of my PhD on metascience.
Both the summary data and source code of the site will be released under open licences.
The team behind the site is myself, Chris Chambers, and Loukia Tzavella (all at Cardiff University), with funding provided by Arnold Ventures.
Many thanks to all the beta-testers whose time and ideas have helped improve the site!
For any questions or comments, our contact details are here:
https://registeredreports.cardiff.ac.uk/feedback/contact/
Thanks!
If you want to win the Nobel Prize for science, then just do this.
Magister Colin Leslie Dean has destroyed your biology with one sentence:
You accept species.
You accept species hybridization.
Thus, species hybridization contradicts the notion of species, making evolution, i.e., evolving species, nonsense.
Thus, if you want to win the Nobel Prize for science, be an Einstein and put the anomalies (hybridizations) into a new paradigm.
A paradigm shift is required to take account of the fact that species and evolution are in fact nonsense.
So what is a species?
Scientific reality is textual
http://gamahucherpress.yellowgum.com/wp-content/uploads/Scientific-reality-is-textual.pdf
or
https://www.scribd.com/document/572639157/Scientific-Reality-is-Textual
Just a definition:
https://www.nationalgeographic.org/encyclopedia/species/
"A species is often defined as a group of organisms that can reproduce naturally with one another and create fertile offspring"
But species hybridization contradicts that:
https://kids.frontiersin.org/articles/10.3389/frym.2019.00113
"When organisms from two different species mix, or breed together, it is known as hybridization"
"Fertile hybrids create a very complex problem in science, because this breaks a rule from the Biological Species Concept"
So the definition of species is nonsense.
Note: when biologists can't tell us what a species is without contradiction, evolution theory, i.e., evolving species, is nonsense.
Evolution is a myth.
Hi all!
For the past few years, a small team of us here at System has been working to build a platform to organize the world's data and knowledge in a whole new way. We just launched our public beta, and we'd love for you to check it out at System.com.
Our commitment to open data and open science is explicitly codified in our Public Benefit Charter. Like Wikipedia, the information on System is available under Creative Commons Attribution ShareAlike License, and topic definitions on System are sourced from Wikidata.
V1.0-beta of System is read-only, but soon, anyone will be able to contribute evidence of relationships. To become an early contributor of data or research to System (whether it’s research you’ve authored yourself, or published research that exists elsewhere), or just to be part of our growing community of systems thinkers, please come join us on Slack.
A few days ago, I discussed a project that I've been developing for assessing scientific predictive power. I've written a much more detailed explanation of the ideas behind it, and today I uploaded it to the arXiv physics preprint server here:
Currently working on a scoping review protocol and wondering if anyone has experience publishing with JBI Evidence Synthesis? Do they require authors to complete their training, or is it optional? I haven't found anything in the author guidelines, but I've heard informally that they do require it. TIA
Hi all,
I'm running my first methodological survey of sorts, based on a couple of simple searches in PubMed that included a time restriction covering the past five years, up to July 31, 2021.
I ran both searches on August 2 and got 1,125 and 131 hits, respectively. Out of curiosity (and maybe some anxiety as this is my first time) I ran it again recently and got 1,122 and 132 instead.
I'm less concerned about the 1,122, since my team and I are already working through the larger set. But I was able to identify the one "missing record" of the 132: it has a creation date of April 3 and was published in the July issue of a journal.
Any more experienced searchers out there who could tell me what's happening here? Is this a common experience? More importantly, should I go ahead and add that one reference, or just accept that it didn't come up when the search was originally run?
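For what it's worth, PubMed hit counts are known to drift between runs: records get re-indexed, MeSH terms are added after the creation date, and publisher corrections can move a record in or out of a date-restricted search. One common mitigation is to document the exact date each search was run (as PRISMA-S recommends) and, if you want a more stable window, to restrict on the entry date ([edat]) rather than the publication date, which reduces (though doesn't eliminate) this drift. A minimal sketch in R with the rentrez package; the search term here is a placeholder, not your actual strategy:

    library(rentrez)

    ## Placeholder query: restrict on entry date [edat] so records
    ## indexed after the original run date cannot enter the window.
    query <- paste0('"systematic review"[ti] AND ',
                    '("2016/08/01"[edat] : "2021/07/31"[edat])')

    res <- entrez_search(db = "pubmed", term = query, retmax = 0)
    cat("Run on", format(Sys.Date()), "- hits:", res$count, "\n")

Losing a record or two between runs is a common experience, so either choice is defensible as long as you report it: many teams add the known relevant record and note in the methods how it was identified.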