/r/redditdev
A subreddit for discussion of Reddit's API and Reddit API clients.
Please confine discussion to Reddit's API instead of using this as a soapbox to talk to the admins. In particular, use /r/ideasfortheadmins for feature ideas and /r/bugs for bugs. If you have general reddit questions, try /r/help.
To see an explanation of recent user-facing changes to reddit (and the code behind them), check out /r/changelog.
To report a security issue with reddit, please send an email to whitehats@reddit.com.
This is an admin-sponsored subreddit.
Hi, I'm trying to scrape the past 10 years of posts from a specific subreddit. I'm using PRAW, doing something like:
for submission in reddit.subreddit(subreddit_name).new(limit=None):
But this only returns the most recent ~800 posts and then stops. I think this might be a limit or pagination issue, so I tried something I found on the web:
submissions = reddit.subreddit(subreddit_name).new(limit=500, params={'before': last_submission_id})
where I perform custom pagination. This doesn't work at all!
May I get suggestions on what other APIs/tools to try, where to look for relevant documentation, or what's wrong with my syntax? Thanks!
P.S.: I don't have access to Pushshift, as I'm not a mod of the subreddit.
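For reference, here is a sketch of explicit cursor pagination: the listing `after` param expects the previous page's last *fullname* (e.g. `t3_abc123`), not a bare submission ID, which may be why the `before` attempt failed. Note that Reddit listings are capped at roughly the newest 1000 items, so 10 years of history is unreachable this way regardless. The helper is illustrative and duck-typed:

```python
def paginate_new(subreddit, page_size=100):
    """Yield submissions from `subreddit.new`, paging with `after` fullnames.

    `subreddit` is any object with a PRAW-style .new(limit=..., params=...)
    method; with real PRAW, pass reddit.subreddit("some_sub"). Stops when a
    page comes back empty (in practice, at the ~1000-item listing cap).
    """
    after = None
    while True:
        params = {"after": after} if after else {}
        batch = list(subreddit.new(limit=page_size, params=params))
        if not batch:
            return
        yield from batch
        after = batch[-1].fullname  # a fullname looks like "t3_" + the post id
```

For history older than the listing cap, third-party archives (e.g. Pushshift, where accessible) or on-going collection going forward are the usual workarounds.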
Hi, so I made a basic Reddit bot that answers when it's mentioned.
While running normally nothing happens, but when someone mentions it and it tries to reply, it gets instantly suspended.
Below is the output of the last three iterations of the loop. It looks like I'm being given 1000 requests and then being stopped. I'm logged in, and print(reddit.user.me())
prints my username. From what I've read, if I'm logged in then PRAW is supposed to do whatever it needs to avoid rate limiting for me, so why is this happening?
competitiveedh
Fetching: GET https://oauth.reddit.com/r/competitiveedh/about/ at 1730683196.4189775
Data: None
Params: {'raw_json': 1}
Response: 200 (3442 bytes) (rst-3:rem-4.0:used-996 ratelimit) at 1730683196.56501
cEDH
Fetching: GET https://oauth.reddit.com/r/competitiveedh/hot at 1730683196.5660112
Data: None
Params: {'limit': 2, 'raw_json': 1}
Sleeping: 0.60 seconds prior to call
Response: 200 (3727 bytes) (rst-2:rem-3.0:used-997 ratelimit) at 1730683197.4732685
trucksim
Fetching: GET https://oauth.reddit.com/r/trucksim/about/ at 1730683197.4742687
Data: None
Params: {'raw_json': 1}
Sleeping: 0.20 seconds prior to call
Response: 200 (2517 bytes) (rst-2:rem-2.0:used-998 ratelimit) at 1730683197.887361
TruckSim
Fetching: GET https://oauth.reddit.com/r/trucksim/hot at 1730683197.8883615
Data: None
Params: {'limit': 2, 'raw_json': 1}
Sleeping: 0.80 seconds prior to call
Response: 200 (4683 bytes) (rst-1:rem-1.0:used-999 ratelimit) at 1730683198.929595
battletech
Fetching: GET https://oauth.reddit.com/r/battletech/about/ at 1730683198.9305944
Data: None
Params: {'raw_json': 1}
Sleeping: 0.40 seconds prior to call
Response: 200 (3288 bytes) (rst-0:rem-0.0:used-1000 ratelimit) at 1730683199.5147257
Home of the BattleTech fan community
Fetching: GET https://oauth.reddit.com/r/battletech/hot at 1730683199.5157266
Data: None
Params: {'limit': 2, 'raw_json': 1}
Response: 429 (0 bytes) (rst-0:rem-0.0:used-1000 ratelimit) at 1730683199.5897427
Traceback (most recent call last):
This is where I received the 429 HTTP response.
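For context, the `rst`/`rem`/`used` numbers in the log mirror Reddit's rate-limit headers: each OAuth client gets a fixed budget (1000 requests per 10-minute window on the free tier) and a 429 fires once `rem` hits 0. PRAW paces requests within the window but cannot conjure more budget. A minimal sketch of reading those headers (the header names are Reddit's; the helper itself is illustrative and assumes lowercased keys):

```python
def seconds_to_wait(headers):
    """Given Reddit's rate-limit response headers, return how long to sleep.

    Reddit reports x-ratelimit-remaining (requests left in the window) and
    x-ratelimit-reset (seconds until the window resets). When remaining
    hits 0, any further call returns 429, as in the log above.
    """
    remaining = float(headers.get("x-ratelimit-remaining", 1))
    reset = int(headers.get("x-ratelimit-reset", 0))
    return reset if remaining <= 0 else 0
```

Sleeping for `seconds_to_wait(response.headers)` before the next call would let the window reset instead of hitting the 429.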
Hey, I'm working on a school project and need to complete a Jupyter notebook with some Reddit data scraping, but I'm having problems with the login for no apparent reason. I created an app of the "web" type, so it should be fine with only client_id and client_secret (I tried with username and password too, same result).
This is the code I'm using to authenticate:
```
reddit_id = os.getenv("REDDIT_ID")
reddit_secret = os.getenv("REDDIT_SECRET")
user_agent = f"script:my_reddit_app:v1.0 (by u/{reddit_id})"
reddit = praw.Reddit(
    client_id=reddit_id,
    client_secret=reddit_secret,
    # username=os.getenv("REDDIT_USERNAME"), password=os.getenv("REDDIT_PASSWORD"),
    user_agent=user_agent,
)
print(reddit.user.me())
```
The app's redirect URL is `https://www.reddit.com/prefs/apps/`, but I tried different ones and that doesn't seem to be the problem, because in theory it never gets accessed since I'm in read-only mode.
This is the traceback I get:
(Saving you some time: yes, all the creds are correct; yes, the app is correctly created; yes, I already looked at the manual and every possible link on the internet; no, I don't have 2FA on.
Even if I visit the .../v1/token URL from the browser and enter the correct username and password, I keep getting redirected to the same /v1/token page asking for them again.)
```
DEBUG:prawcore:Fetching: GET https://oauth.reddit.com/api/v1/me at 1730632032.657062
DEBUG:prawcore:Data: None
DEBUG:prawcore:Params: {'raw_json': 1}
praw version == 7.8.1
---------------------------------------------------------------------------
RequestException Traceback (most recent call last)
/tmp/ipykernel_68679/297234463.py in <module>
26 )
27
---> 28 print(reddit.user.me())
29
30 #print(f"REDDIT_ID: {reddit_id}")
~/.local/lib/python3.10/site-packages/praw/util/deprecate_args.py in wrapped(*args, **kwargs)
44 stacklevel=2,
45 )
---> 46 return func(**dict(zip(_old_args, args)), **kwargs)
47
48 return wrapped
~/.local/lib/python3.10/site-packages/praw/models/user.py in me(self, use_cache)
168 raise ReadOnlyException(msg)
169 if "_me" not in self.__dict__ or not use_cache:
--> 170 user_data = self._reddit.get(API_PATH["me"])
171 self._me = Redditor(self._reddit, _data=user_data)
172 return self._me
~/.local/lib/python3.10/site-packages/praw/util/deprecate_args.py in wrapped(*args, **kwargs)
44 stacklevel=2,
45 )
---> 46 return func(**dict(zip(_old_args, args)), **kwargs)
47
48 return wrapped
~/.local/lib/python3.10/site-packages/praw/reddit.py in get(self, path, params)
729
730 """
--> 731 return self._objectify_request(method="GET", params=params, path=path)
732
733 @_deprecate_args("fullnames", "url", "subreddits")
~/.local/lib/python3.10/site-packages/praw/reddit.py in _objectify_request(self, data, files, json, method, params, path)
512 """
513 return self._objector.objectify(
--> 514 self.request(
515 data=data,
516 files=files,
~/.local/lib/python3.10/site-packages/praw/util/deprecate_args.py in wrapped(*args, **kwargs)
44 stacklevel=2,
45 )
---> 46 return func(**dict(zip(_old_args, args)), **kwargs)
47
48 return wrapped
~/.local/lib/python3.10/site-packages/praw/reddit.py in request(self, data, files, json, method, params, path)
961 raise ClientException(msg)
962 try:
--> 963 return self._core.request(
964 data=data,
965 files=files,
~/.local/lib/python3.10/site-packages/prawcore/sessions.py in request(self, method, path, data, files, json, params, timeout)
326 json["api_type"] = "json"
327 url = urljoin(self._requestor.oauth_url, path)
--> 328 return self._request_with_retries(
329 data=data,
330 files=files,
~/.local/lib/python3.10/site-packages/prawcore/sessions.py in _request_with_retries(self, data, files, json, method, params, timeout, url, retry_strategy_state)
232 retry_strategy_state.sleep()
233 self._log_request(data, method, params, url)
--> 234 response, saved_exception = self._make_request(
235 data,
236 files,
~/.local/lib/python3.10/site-packages/prawcore/sessions.py in _make_request(self, data, files, json, method, params, retry_strategy_state, timeout, url)
184 ) -> tuple[Response, None] | tuple[None, Exception]:
185 try:
--> 186 response = self._rate_limiter.call(
187 self._requestor.request,
188 self._set_header_callback,
~/.local/lib/python3.10/site-packages/prawcore/rate_limit.py in call(self, request_function, set_header_callback, *args, **kwargs)
44 """
45 self.delay()
---> 46 kwargs["headers"] = set_header_callback()
47 response = request_function(*args, **kwargs)
48 self.update(response.headers)
~/.local/lib/python3.10/site-packages/prawcore/sessions.py in _set_header_callback(self)
280 def _set_header_callback(self) -> dict[str, str]:
281 if not self._authorizer.is_valid() and hasattr(self._authorizer, "refresh"):
--> 282 self._authorizer.refresh()
283 return {"Authorization": f"bearer {self._authorizer.access_token}"}
284
~/.local/lib/python3.10/site-packages/prawcore/auth.py in refresh(self)
423 if two_factor_code:
424 additional_kwargs["otp"] = two_factor_code
--> 425 self._request_token(
426 grant_type="password",
427 username=self._username,
~/.local/lib/python3.10/site-packages/prawcore/auth.py in _request_token(self, **data)
153 url = self._authenticator._requestor.reddit_url + const.ACCESS_TOKEN_PATH
154 pre_request_time = time.time()
--> 155 response = self._authenticator._post(url=url, **data)
156 payload = response.json()
157 if "error" in payload: # Why are these OKAY responses?
~/.local/lib/python3.10/site-packages/prawcore/auth.py in _post(self, url, success_status, **data)
49 self, url: str, success_status: int = codes["ok"], **data: Any
50 ) -> Response:
---> 51 response = self._requestor.request(
52 "post",
53 url,
~/.local/lib/python3.10/site-packages/prawcore/requestor.py in request(self, timeout, *args, **kwargs)
68 return self._http.request(*args, timeout=timeout or self.timeout, **kwargs)
69 except Exception as exc: # noqa: BLE001
---> 70 raise RequestException(exc, args, kwargs) from None
RequestException: error with request Failed to parse: https://www.reddit.com/api/v1/access_token
https://www.reddit.com/api/v1/access_token
```
Thanks!
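For what it's worth, `reddit.user.me()` needs a user context: with only `client_id`/`client_secret`, PRAW is read-only, and un-commenting `username`/`password` uses the password grant, which Reddit only allows for apps registered as the "script" type, not "web". A sketch of the script-app setup (the env-var names are assumptions carried over from the snippet above):

```python
import os

def reddit_kwargs():
    """Assemble praw.Reddit(...) kwargs for a *script*-type app.

    The password grant only works when the app is registered as type
    "script" on https://www.reddit.com/prefs/apps and 2FA is off.
    """
    return {
        "client_id": os.getenv("REDDIT_ID"),
        "client_secret": os.getenv("REDDIT_SECRET"),
        "username": os.getenv("REDDIT_USERNAME"),
        "password": os.getenv("REDDIT_PASSWORD"),
        "user_agent": "script:my_reddit_app:v1.0 (by u/your_username)",
    }

# reddit = praw.Reddit(**reddit_kwargs())
# print(reddit.user.me())  # prints the account name once authenticated
```

If read-only access suffices, dropping the `reddit.user.me()` call entirely (and using, say, `reddit.subreddit(...)` directly) avoids the token refresh that raises here.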
I want to use services like NewsWhip, Brand24 and Segue but I can’t figure out how these services comply with Reddit’s dev terms or usage policy. Can anyone explain how this would be compliant, or do they all have a commercial license with Reddit?
Basically, I want to see if any Reddit engineers are open to chatting about what it's like working on Reddit's source code. I saw that their SWE internships just opened and wanted to know if it's worth applying.
Will Reddit get mad if an OAuth API app re-posts the same content to multiple subscribed subreddits? Would this get my app suspended?
We built a super simple example / test app and have uploaded it. However, we can't seem to get our custom post type to show up in our test subreddit.
Besides possibly needing to be on a whitelist, are we doing anything else wrong?
This is the main.tsx:
import { Devvit, JSONObject } from '@devvit/public-api';

Devvit.addCustomPostType({
  name: 'Bonsai',
  // height: 'regular',
  render: (context) => {
    const { useState } = context;
    const [myState, setMyState] = useState({});
    const handleMessage = (ev: JSONObject) => {
      console.log(ev);
      console.log('Hello Bonsai!');
    };
    return (
      <>
        <vstack height="100%" width="100%" gap="medium" alignment="center middle">
          <text>Hello Bonsai!</text>
        </vstack>
      </>
    );
  },
});
If I create a private subreddit, is it possible to handle the approved user list with the API? What endpoints can I use?
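Yes — approved users are "contributors" in API terms, and PRAW wraps the relevant endpoints. A sketch of the raw endpoint paths (the helper mapping is mine; the PRAW calls in the comment are the real wrappers):

```python
def approved_user_endpoints(subreddit):
    """Raw API endpoints behind a private sub's approved-user list (sketch).

    POST api/friend with type=contributor adds an approved user,
    api/unfriend removes one, and about/contributors lists them.
    """
    base = f"/r/{subreddit}"
    return {
        "add": f"{base}/api/friend",
        "remove": f"{base}/api/unfriend",
        "list": f"{base}/about/contributors",
    }

# PRAW wraps all three (requires a moderator account):
#   reddit.subreddit("mysub").contributor.add("some_user")
#   reddit.subreddit("mysub").contributor.remove("some_user")
#   list(reddit.subreddit("mysub").contributor())
```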
When I try api/compose and use my personal account to send messages to my friends, I always get this error. Has anyone encountered the same situation? What is the reason or how to solve it?
I'm trying to run some code and keep running into the problem of the computer not liking prawcore. I can see it in my pip list, and I've gotten the computer to confirm it's downloaded, but when I run `python main.py` it tells me `ModuleNotFoundError: No module named 'prawcore'`. What should I do?
What is the difference between these two? I want to create a Reddit app that a user can log into and perform actions on the API. However, I haven't decided whether I want a mobile version or a web application yet (or maybe both eventually). I want to create a backend service first and think about the GUI later. Is this possible? Which one would be more appropriate?
Hi everyone,
So a user of my product noticed they could not post in this sub: https://www.reddit.com/r/TechHelping/
New posts throw a 403, and looking at the website, this seems to be because there is a "request permission to post" gate?
I've never seen this before, so how does this translate into the API?
It is possible to fetch subreddit data from the API without authentication: you just send a GET request to the subreddit URL with ".json" appended (https://www.reddit.com/r/redditdev.json), from anywhere you want.
I want to make an app that uses this API. It will display statistics for subreddits (number of users, number of comments, number of votes, etc.).
Am I allowed to build a web app that uses data acquired this way? Reddit's terms are not very clear on this.
Thank you in advance :)
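On the technical side, the unauthenticated listing works as described, though a descriptive User-Agent matters (default client agents are throttled aggressively); whether a given use is permitted is a question for Reddit's Data API Terms. A small sketch, with the network call left commented out (helper names are mine):

```python
def subreddit_json_url(name):
    """Public JSON URL for a subreddit's listing (no OAuth needed)."""
    return f"https://www.reddit.com/r/{name}.json"

def extract_titles(listing):
    """Pull post titles out of a Reddit listing payload."""
    return [child["data"]["title"] for child in listing["data"]["children"]]

# Usage (network call, so commented out here):
# import json, urllib.request
# req = urllib.request.Request(
#     subreddit_json_url("redditdev"),
#     headers={"User-Agent": "stats-app/0.1 (by u/yourname)"},  # descriptive UA
# )
# listing = json.load(urllib.request.urlopen(req))
# print(extract_titles(listing)[:5])
```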
I'm building a cross-posting app. When posting to Reddit, some subreddits require flairs. I need to fetch available flairs when a user selects a subreddit and then send the flair in the post.
const response = await fetch(
  `https://oauth.reddit.com/r/${subreddit}/api/link_flair_v2`,
  {
    headers: {
      Authorization: `Bearer ${accessToken}`,
      "User-Agent": "X/1.0.0",
    },
  }
);
Getting 403 Forbidden. According to the docs, the endpoints are /api/link_flair or r/subreddit/api/link_flair_v2.
How can I properly fetch available flairs for a given subreddit? Has anyone implemented this successfully?
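For reference, PRAW wraps this endpoint as `reddit.subreddit(name).flair.link_templates.user_selectable()`, which yields flair-template dicts; a 403 on the raw call often means the token is missing the `flair` scope or the account cannot post in that subreddit. A sketch of selecting a template from the returned dicts (the `choose_flair` helper is mine):

```python
def choose_flair(templates, preferred_text=None):
    """Pick a flair template id from /api/link_flair_v2-style dicts.

    Each template dict carries at least "flair_template_id" and
    "flair_text". Falls back to the first template when no text matches.
    """
    if not templates:
        return None
    if preferred_text is not None:
        for t in templates:
            if t.get("flair_text") == preferred_text:
                return t["flair_template_id"]
    return templates[0]["flair_template_id"]

# With PRAW (needs an authenticated client with the `flair` scope):
# templates = list(reddit.subreddit("some_sub").flair.link_templates.user_selectable())
# flair_id = choose_flair(templates, preferred_text="Discussion")
# reddit.subreddit("some_sub").submit("title", url="https://...", flair_id=flair_id)
```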
It seems that the maximum number of submissions I can fetch is 1000:
limit – The number of content entries to fetch. If limit is None, then fetch as many entries as possible. Most of Reddit's listings contain a maximum of 1000 items, and are returned 100 at a time. This class will automatically issue all necessary requests (default: 100).
Can anyone shed some more light on this limit? What happens with None? If I'm using .new(limit=None), how many submissions am I actually getting at most? Also, how many API requests am I making? Just whatever number I type in, divided by 100?
Use case: I want the URLs of as many submissions as possible. These URLs are then passed through random.choice(URLs) to get a single random submission link from the subreddit.
Actual code. Get submission titles (image submissions):
def get_image_links(reddit: praw.Reddit) -> list:
    sub = reddit.subreddit('example')
    image_candidates = []
    for image_submission in sub.new(limit=None):
        if re.search('(i.redd.it|i.imgur.com)', image_submission.url):
            image_candidates.append(image_submission.url)
    return image_candidates
These image links are then saved to a variable which is then later passed onto the function that generates the bot's actual functionality (a comment reply):
def generate_reply_text(image_links: list) -> str:
    ...
    bot_reply_text += f'''[{link_text}]({random.choice(image_links)})'''
    ...
I noticed over the last couple hours some extreme latency when my bot is downloading images. It's also noticeable when browsing Reddit on my phone (while on my wifi). It's the 2nd time in the last 2 weeks I've seen something similar happen.
Status page is green and it's the only domain impacted so I suspect it's some type of throttling being tested.
No changes on my end. The bot is doing the same thing it's done for years.
I'm not sure but it seems that all the communities I fetch through the /subreddits/ API come with the "over18" property set to false. Has this property been discontinued?
How many API requests does it take to cause rate-limiting of an authenticated snoowrap client? Is that number different between reads and writes?
I would guess it changes as Reddit tightens its reins, but it would be helpful if anyone has the current maximum values, in order to effectively debounce/delay requests.
Hi folks,
I'm new to pulling data from APIs and would like some feedback on where I'm going wrong. I've set up a new subreddit, and my goal is to pull data about it into a Google Sheet to help me manage the sub.
So far:
I created an app via the https://old.reddit.com/prefs/apps/ pathway.
I sent a message to Reddit asking for permission to use the API and was granted it a few days back.
I've set up a Google Apps Script (with ChatGPT's help) which pulls the data of posts in the sub.
However, I keep getting an error related to the authentication process: Error: Exception: Request failed for https://oauth.reddit.com returned code 403. Truncated server response: ...
Can anyone give me advice on solving the issue, particularly the OAuth 2 part? Or is there something else that could be wrong with the setup?
I realise this may require more info to troubleshoot, and I'd be happy to share it!
Thanks in advance guys
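As a point of comparison outside Apps Script: the token step is a POST to `https://www.reddit.com/api/v1/access_token` with HTTP Basic auth (client_id as the username, client_secret as the password) and a descriptive User-Agent — a missing or generic User-Agent is a common cause of 403s from oauth.reddit.com. A sketch of assembling that request (pure request-building, no network; the client-credentials grant is an assumption, suitable for read-only access):

```python
import base64

def build_token_request(client_id, client_secret, user_agent):
    """Assemble the pieces of Reddit's client_credentials token request.

    Returns (url, headers, body) suitable for any HTTP client. Script
    apps would send grant_type=password with username/password fields
    instead.
    """
    creds = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    url = "https://www.reddit.com/api/v1/access_token"
    headers = {
        "Authorization": f"Basic {creds}",
        "User-Agent": user_agent,
        "Content-Type": "application/x-www-form-urlencoded",
    }
    body = "grant_type=client_credentials"
    return url, headers, body
```

The returned access token then goes in an `Authorization: bearer <token>` header on requests to oauth.reddit.com.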
Thanks for your attention. I want a bot that could help us get more subscribers while still following Reddit's guidelines.
I would also be willing to pay for one if it doesn't already exist.
I have tried and tried to get this to work, but it is just a nightmare. I'm wondering if anyone has already done this and has a solution that I can use.
async function galleryHandling(post) {
    const imageUrls = [];
    // Note: the image host is i.redd.it (the original code had i.red.it, which
    // never returns 200).
    const extensions = ['jpg', 'jpeg', 'png', 'gif', 'tif', 'tiff', 'bmp', 'webp', 'svg', 'ico'];
    for (const item of post.gallery_data.items) {
        const mediaId = item.media_id;
        for (const ext of extensions) {
            const url = `https://i.redd.it/${mediaId}.${ext}`;
            const statusCode = await checkUrl(url);
            console.log(statusCode);
            if (statusCode === 200) {
                console.log(`GALLERY: ${ext.toUpperCase()} FILE`);
                imageUrls.push(url);
                break;
            }
        }
    }
    return imageUrls;
}

async function singleHandling(post) {
    if (post.url && (post.url.endsWith('.jpg') || post.url.endsWith('.png') || post.url.endsWith('.gif') || post.url.endsWith('.jpeg'))) {
        return post.url;
    }
    // Falls through (returns undefined) for gallery/crosspost/text URLs.
    console.log(`SINGLE HANDLING NOT ENDING IN JPG, PNG, GIF, JPEG | TITLE: ${post.title} | URL: ${post.url}`);
}

async function runBot(reddit) {
    for (const subredditName of Subreddits) {
        const subreddit = await reddit.getSubreddit(subredditName);
        const posts = await subreddit.getNew({ limit: 100 });
        for (const post of posts) {
            let imageUrls = [];   // was an implicit global in the original
            if (post.is_gallery) {
                imageUrls = await galleryHandling(post);
            } else {
                imageUrls.push(await singleHandling(post));
            }
            console.log("------------- NEW LINE -------------");
            imageUrls.forEach(url => console.log(`URL: ${url}`));
        }
        await new Promise(resolve => setTimeout(resolve, 1000)); // was not awaited, so it never paused
    }
}
Sometimes my singleHandling() function will fail and the results are https://www.reddit.com/gallery/-----.
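Rather than probing i.redd.it with every extension, the extension can be read straight from the post's `media_metadata` map, which the listing JSON includes for galleries. A sketch, assuming the field names from the public JSON API (`gallery_data`, `media_metadata`, and the mime string `m`); `galleryUrls` is a hypothetical helper, not part of any library.

```javascript
// Build gallery image URLs from the mime types in media_metadata,
// avoiding one HEAD request per candidate extension.
function galleryUrls(post) {
  if (!post.gallery_data || !post.media_metadata) return [];
  return post.gallery_data.items.map(item => {
    const meta = post.media_metadata[item.media_id];
    const ext = meta.m.split('/').pop();          // e.g. "image/png" -> "png"
    return `https://i.redd.it/${item.media_id}.${ext}`;
  });
}
```

The `reddit.com/gallery/...` URLs from `singleHandling()` are gallery posts that slipped past the `is_gallery` check (e.g. crossposts); checking `post.media_metadata` directly may catch those too.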
This account, 65436563465,
shows as normal/active under old.reddit, suspended under sh.reddit, and just a blank page under new.reddit.
I don't know how the app displays it.
Using the API/PRAW, it looks normal/active.
Is there an API/PRAW method to determine the status of accounts like this?
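One signal worth checking: `/user/<name>/about` returns an `is_suspended` field for suspended accounts (PRAW surfaces the same attribute on a fetched `Redditor`), while shadowbanned or deleted accounts typically return a 404 instead, which the different UIs render inconsistently. A sketch; `accountStatus` and the User-Agent are placeholders of my own, and the 404 interpretation is a heuristic, not documented API behavior.

```javascript
// Classify an account from the /about response.
function accountStatus(httpStatus, about) {
  if (httpStatus === 404) return 'not found (deleted or shadowbanned)';
  if (about && about.data && about.data.is_suspended) return 'suspended';
  return 'active';
}

// Usage sketch against the public JSON endpoint.
async function checkUser(name) {
  const res = await fetch(`https://www.reddit.com/user/${name}/about.json`, {
    headers: { 'User-Agent': 'script:status-check:v0.1 (by /u/yourname)' }, // placeholder UA
  });
  return accountStatus(res.status, res.ok ? await res.json() : null);
}
```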
Why are there so many communities returned by the API with names in the format "r/a:t5_cmjdz", i.e. "r/a:<subreddit_id>"?
Really don't want to maintain a Python environment in my otherwise purely TypeScript app. Anyone out there building the PRAW equivalent for Node.js? JRAW and everything else all seem dated, well beyond the recent Reddit API crackdown.
It posts a random pic from a set of 20, with a random title, and adds flair, posting every 2 hours. It worked fine for the first post, but when I went into my account the next day I saw that all the posts were greyed out, like when the upvote and downvote buttons are greyed out, meaning the posts are somehow getting removed.
Why is this?
Is there any good way to export the comments from a single post on Reddit? I tried adding ".json" to the end of the link in the address bar, but it seems limited to around 20 comments, which makes it less usable. It would be great if there were a trick, or even something I could do from the Ubuntu CLI.
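The .json view defaults to a shallow tree; adding `?limit=500` and walking the replies recursively recovers far more, though "more" stubs still need extra `/api/morechildren` calls (PRAW's `comments.replace_more()` does this automatically). A sketch of the flattening step, assuming the listing's `t1` comment structure; `flattenComments` and `POST_URL` are placeholders of my own.

```javascript
// Recursively flatten a comment listing into { author, body } records,
// skipping unresolved "more" stubs.
function flattenComments(children, out = []) {
  for (const child of children) {
    if (child.kind !== 't1') continue;            // "more" stubs need /api/morechildren
    out.push({ author: child.data.author, body: child.data.body });
    const replies = child.data.replies;           // empty string when there are none
    if (replies && replies.data) flattenComments(replies.data.children, out);
  }
  return out;
}

// Usage sketch (POST_URL is a placeholder for the post's permalink):
//   const [, comments] = await (await fetch(POST_URL + '.json?limit=500&depth=10')).json();
//   console.log(flattenComments(comments.data.children));
```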
Documentation says that a user-agent header must look like this
```
<platform>:<app ID>:<version string> (by /u/<reddit username>)
```
But there is zero information about what the platform, version string, or Reddit username should actually be.
I spent a whole day just logging in and fetching `/api/me`. This documentation is the worst I've ever seen. Am I stupid, or is the doc really "not good"?
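For what it's worth, those fields are freeform and self-declared: "platform" is just where the script runs, the version string is your own version number, and the username is your Reddit account. A concrete example of the documented pattern (the app name and username below are placeholders):

```javascript
// Example User-Agent following <platform>:<app ID>:<version> (by /u/<username>).
const USER_AGENT = 'nodejs:my-test-app:v0.0.1 (by /u/yourname)';

// Usage sketch: identity call with a bearer token obtained earlier.
async function whoAmI(token) {
  const res = await fetch('https://oauth.reddit.com/api/v1/me', {
    headers: { Authorization: `Bearer ${token}`, 'User-Agent': USER_AGENT },
  });
  return res.json();
}
```

The main rules Reddit actually enforces are: send *some* unique, descriptive User-Agent, don't impersonate a browser UA, and keep it stable per app.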
Hi there,
I have downloaded about 3 Windows 11 Reddit clients (Bacconit, Reddert, etc.) this evening. They all have the same overall look about them, and the same thing happens when I try to log in to my account on these clients (I'm using the reddit website currently). When I click Login, a blank white window appears that says "Connecting to a service" on the top display bar. A place for me to type my username and password never does appear, so I end up closing the app and trying another one. This same thing has happened to me now 3 times. I always make sure I'm logged out of my reddit account on the web before I try logging in to my account on a Windows 11 client. Any ideas what's happening?
Thanks in advance,
th3lung
Dear Reddit Development Team
Hi! I have chosen Reddit as a project for my research course. My goal is to gather general information about the website's architecture and the technology stack being used.
Could you please assist me in finding resources or relevant threads on this topic?
So far, I have found this thread: https://www.reddit.com/r/redditdev/comments/11vd16y/reddit_system_designarchitecture/
Are the answers in it still relevant?
I also received a response from ChatGPT, but I'm not sure about the accuracy of the information.
On 13 Oct 2024, ChatGPT described Reddit's architecture to me as follows:
Backend Architecture

- Databases:
    - PostgreSQL for relational data (users, posts).
    - Cassandra for distributed, high-frequency data (e.g., votes).
    - Redis for caching and session management, with HyperLogLog for tracking views.
- Frameworks & APIs:
    - Microservices in Python (Flask) and Go.
    - RESTful API, with increasing use of GraphQL for frontend queries.

Cloud Hosting & Infrastructure

- Hosted on AWS:
    - EC2 for compute, S3 for storage, RDS for databases.
    - Kubernetes for container orchestration.

Caching, Load Balancing & Delivery

- Redis and Memcached with Mcrouter for caching.
- HAProxy for load balancing.
- Spinnaker and Jenkins for CI/CD automation.

Search, Analytics & Monitoring

- Lucidworks Fusion (built on Solr) for search.
- Kafka and Hive for analytics, processed via EMR.
- Prometheus, Grafana, and the ELK stack for monitoring and logging.

Frontend Architecture

- Built with React and TypeScript.
- Redux handles state management across web and mobile interfaces.
I am deeply interested in learning more about the technical infrastructure that powers Reddit. If it's not under NDA, I would greatly appreciate any insights you could provide into the current systems and services Reddit utilizes.