Django community: RSS
This page, updated regularly, aggregates Django Q&A from the Django community.
-
How to stop StreamingHttpResponse in Django on Google Cloud Run?
We have integrated the GPT API in our Django application running on Google Cloud Run. When a user makes a request, we send them a response using StreamingHttpResponse from django.http, enabling real-time streaming. However, we currently do not have a way for users to stop an ongoing StreamingHttpResponse. We are looking for a solution to terminate the stream early if needed—without using WebSockets and without relying on Redis or other services that require VPC connectors, as they are costly for us at the moment. Is there a way to achieve this within our existing Google Cloud Run setup? -
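A possible approach (a sketch, not from the post; the view names, stream_id and cache key are assumptions): have the streaming generator poll a per-stream stop flag between chunks, with a second endpoint that sets the flag. Kept in Django's cache framework with a database cache backend, this needs neither Redis nor a VPC connector.

```python
from django.core.cache import cache
from django.http import JsonResponse, StreamingHttpResponse


def stream_view(request, stream_id):
    def event_stream():
        # generate_gpt_chunks() is a placeholder for whatever wraps the GPT API
        for chunk in generate_gpt_chunks():
            if cache.get(f"stop-stream:{stream_id}"):
                break  # the user asked to stop: end the response early
            yield chunk

    return StreamingHttpResponse(event_stream(), content_type="text/event-stream")


def stop_stream_view(request, stream_id):
    # called by the client when the user presses "stop"
    cache.set(f"stop-stream:{stream_id}", True, timeout=600)
    return JsonResponse({"stopped": True})
```

The stream still only ends at the next chunk boundary, and on Cloud Run a shared store such as the database cache is what lets the stop request reach whichever instance is holding the stream.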
"LookupError: No installed app with label 'admin'." when using muppy in django
I have a django + drf application that has no admin site, which works very well for us. However, when using pympler and muppy like this: class DashboardViewSet( SpecialEndpoint, ): def list(self, request, *args, **kwargs): from pympler import tracker tr = tracker.SummaryTracker() [...] tr.print_diff() return Response(...) I get this error: File "src/api/views/case_manager/dashboard/dashboard_viewset.py", line 33, in list tr = tracker.SummaryTracker() File "lib/python3.13/site-packages/pympler/tracker.py", line 45, in __init__ self.s0 = summary.summarize(muppy.get_objects()) ~~~~~~~~~~~~~~~~~^^ File "lib/python3.13/site-packages/pympler/muppy.py", line 42, in get_objects tmp = [o for o in tmp if not ignore_object(o)] ~~~~~~~~~~~~~^^^ File "lib/python3.13/site-packages/pympler/muppy.py", line 17, in ignore_object return isframe(obj) File "lib/python3.13/inspect.py", line 507, in isframe return isinstance(object, types.FrameType) File "lib/python3.13/site-packages/django/utils/functional.py", line 280, in __getattribute__ value = super().__getattribute__(name) File "lib/python3.13/site-packages/django/utils/functional.py", line 251, in inner self._setup() ~~~~~~~~~~~^^ File "lib/python3.13/site-packages/django/contrib/admin/sites.py", line 605, in _setup AdminSiteClass = import_string(apps.get_app_config("admin").default_site) ~~~~~~~~~~~~~~~~~~~^^^^^^^^^ File "lib/python3.13/site-packages/django/apps/registry.py", line 165, in get_app_config raise LookupError(message) LookupError: No installed app with label 'admin'. This seems to happen as a DefaultAdminSite is always created, lazily, in a global variable, which muppy accesses to get the size. Any idea how to work around this? -
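One possible workaround (a sketch that leans on Django internals; _wrapped is a private LazyObject attribute, so treat this as an assumption to verify): replace the lazily created admin site proxy with a concrete AdminSite before taking snapshots, so that muppy's isinstance() checks never trigger the proxy's _setup() lookup of the missing 'admin' app config.

```python
from django.contrib.admin.sites import AdminSite, site as lazy_admin_site


def neutralise_lazy_admin_site():
    # Pre-populate the LazyObject so later attribute access (including the
    # __class__ lookups isinstance() performs) no longer calls _setup().
    lazy_admin_site._wrapped = AdminSite()
```

Calling this once before constructing tracker.SummaryTracker(), for example at the top of the view or in an AppConfig.ready(), should stop get_objects() from tripping over the proxy.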
Django Email Configuration: SSL Certificate Verification Failed with GoDaddy SMTP Server
I'm trying to configure Django to send emails using GoDaddy's SMTP server (smtpout.secureserver.net). My email account was created on GoDaddy, and I have the following settings in my settings.py file: import os MAIL_BACKEND = 'django.core.mail.backends.smtp.EmailBackend' EMAIL_HOST = 'smtpout.secureserver.net' EMAIL_HOST_USER = os.environ.get("EMAIL_HOST") EMAIL_HOST_PASSWORD = os.environ.get("EMAIL_HOST_PASSWORD") DEFAULT_FROM_EMAIL = os.environ.get("EMAIL_HOST") EMAIL_PORT = 465 EMAIL_USE_SSL = True EMAIL_USE_TLS = False I've set my environment variables as follows: EMAIL_HOST = ABC@dagger.com # GoDaddy email EMAIL_HOST_PASSWORD = Abc@1223 # GoDaddy email password However, when trying to send an email, I get the following error: File "C:\Users\jinal.desai\AppData\Local\Programs\Python\Python310\lib\ssl.py", line 1071, in _create self.do_handshake() File "C:\Users\jinal.desai\AppData\Local\Programs\Python\Python310\lib\ssl.py", line 1342, in do_handshake self._sslobj.do_handshake() ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1007) What I've Tried: Verified that my GoDaddy email credentials are correct. Tried setting EMAIL_USE_TLS = True and EMAIL_USE_SSL = False. Used EMAIL_PORT = 587 instead of 465. Manually verified that smtpout.secureserver.net works using an external email client. Attempted to disable SSL certificate verification using: import ssl ssl._create_default_https_context = ssl._create_unverified_context Question: How can I fix this SSL certificate verification error with GoDaddy's SMTP? Do I need to install or trust a specific SSL certificate for GoDaddy's mail server? Is there any known workaround to resolve this … -
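A diagnostic sketch (it assumes the certifi package is available; the credentials are placeholders): try the same handshake outside Django with an explicit, up-to-date CA bundle. If this succeeds, the failure is about the CA store your Python process uses (or a TLS-intercepting proxy/antivirus injecting a self-signed certificate), and pointing the Django process at a proper bundle, for example via the SSL_CERT_FILE environment variable, is a much safer fix than disabling verification.

```python
import smtplib
import ssl

import certifi

# Build a verification context from certifi's CA bundle instead of the
# system default that is currently rejecting the chain.
context = ssl.create_default_context(cafile=certifi.where())

with smtplib.SMTP_SSL("smtpout.secureserver.net", 465, context=context) as server:
    server.login("you@example.com", "your-password")  # placeholder credentials
    print(server.noop())
```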
Django admin: strange view
I couldn't understand what's wrong with my admin part. It looks like this: I understood that this effect may depend on 'static' resources. The HTML preview shows: <head> <title>Log in | Django site admin</title> <link rel="stylesheet" href="/staticfiles/admin/css/base.css"> <link rel="stylesheet" href="/staticfiles/admin/css/dark_mode.css"> <script src="/staticfiles/admin/js/theme.js"></script> <link rel="stylesheet" href="/staticfiles/admin/css/nav_sidebar.css"> <script src="/staticfiles/admin/js/nav_sidebar.js" defer></script> <link rel="stylesheet" href="/staticfiles/admin/css/login.css"> and I checked - all files exist in the staticfiles directory. But when I click a file in Chrome, such a page appears: Page not found (404) Request Method: GET Request URL: http://amodule.su/staticfiles/admin/css/nav_sidebar.css Using the URLconf defined in amodule.urls, Django tried these URL patterns, in this order: [name='home'] news/ admin/ The current path, staticfiles/admin/css/nav_sidebar.css, didn’t match any of these. My settings.py contains these definitions for static: STATIC_URL = 'staticfiles/' DEFAULT_AUTO_FIELD = 'django.db.models.BigAutoField' MEDIA_ROOT = os.path.join(BASE_DIR, 'media') MEDIA_URL = '/media/' STATIC_ROOT = os.path.join(BASE_DIR, 'staticfiles') Where am I wrong? -
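For comparison, a sketch of one common production setup (WhiteNoise is an assumption, not something the post mentions; BASE_DIR is as already defined in settings.py): with DEBUG = False Django does not serve anything under STATIC_ROOT itself, so the admin CSS 404s even though the files exist on disk after collectstatic. Either make the web server serve /staticfiles/, or let the app do it with WhiteNoise:

```python
import os

STATIC_URL = "/staticfiles/"  # leading slash so links resolve from any page
STATIC_ROOT = os.path.join(BASE_DIR, "staticfiles")

MIDDLEWARE = [
    "django.middleware.security.SecurityMiddleware",
    "whitenoise.middleware.WhiteNoiseMiddleware",  # serves STATIC_ROOT in production
    # ... the rest of the existing middleware ...
]
```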
Handling multiple related objects in the admin without using inlines
I am customising Django's admin for a particular view on some data. Where I have got so far: I have a Project model, and a ProjectObjectiveCondition model that has a ForeignKey to it (the actual models (on GitHub)): class Project(models.Model): [...] class ProjectObjectiveCondition(models.Model): project = models.ForeignKey(Project, on_delete=models.CASCADE) With a custom inline admin template (on GitHub), the inlines look like this: What you are seeing there is: Objective: Agreeableness Level: Started Condition: Speaks pleasantly Condition: Doesn't shout Level: First results Condition: Accepts thanks with grace Level: Mature results Condition: Apologises with sincerity Objective: Colourfulness Level: Started Condition: Has blue ... and so on The Conditions are grouped by Objective and Level through repeated use of {% regroup %} in the template. Date/commitment columns: You can also see columns for the dates (instances of the WorkCycle class) and their True/False values, which show whether a commitment was made for this Project, and to reach which Level, in which WorkCycle (i.e. in which year). I want to replace those with editable tick-boxes. I actually have another inline, LevelCommitmentInline, which allows me to edit those True/False values for that model - but I don't want them in a long list somewhere else, I want … -
React Blog Page Not Loading & Comments Section Not Working
I am working on a React blog page where users can view blog posts and leave comments (similar to YouTube's comment section). However, after clicking on the blog page and then clicking on a blog I wish to view, the content no longer loads properly and is just stuck saying "loading", so the contents are no longer visible on the specified page and the comment section isn't visible either. What I Tried & Expected Outcome: Fetching Blog Posts: Used axios.get('http://127.0.0.1:8000/api/blogs/') inside useEffect(). I expected posts to load, but they do not appear. Deleting Blog Posts: Used axios.delete(`http://127.0.0.1:8000/api/blogs/delete/${blogId}/`, { headers: { Authorization: `Token ${user.token}` } }). Expected blog deletion to work, but encountered errors. Commenting System: Implemented a comment section using axios.post() to submit new comments, but nothing is being displayed at all; it's just stuck saying "loading". Expected comments to display, but they do not appear. Here's the backend snippet of the code I'm trying to run: @api_view(['GET', 'POST']) @authentication_classes([TokenAuthentication]) @permission_classes([IsAuthenticated]) def blog_comments(request, blog_id): """ GET /blogs/<blog_id>/comments/ Returns: All comments for specified blog POST /blogs/<blog_id>/comments/ Required data: content, parent (optional for replies) Returns: Created comment data Requires: Authentication Token """ blog = get_object_or_404(Blog, id=blog_id) if request.method == 'GET': comments = … -
Embed a Django context variable into an existing string
In my Django project I have this template snippet: <button type="button" onClick="window.location.href='product-without-subscription/{{ context_variable_from_view }}'" > My Button </button> Here context_variable_from_view comes from my context prior to HTML rendering. In the rendered page I can't get context_variable_from_view to be embedded inside the link string in the button. -
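For comparison, a minimal sketch of the view side (view name, template name and value are illustrative assumptions): the placeholder is only substituted if the template is rendered by a view that actually puts that key into the context, and if this template is the one the URL resolves to.

```python
from django.shortcuts import render


def product_page(request):
    # the key must match the name used in the template exactly
    context = {"context_variable_from_view": "abc-123"}
    return render(request, "product_page.html", context)
```

If the value can contain quotes or other special characters, passing it through the escapejs filter inside the onClick attribute keeps the generated JavaScript valid.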
Import Error: module does not define a "CustomJWTAuthentication" attribute/class
I'm building a REST Auth API with Django/DRF. All of a sudden when I start working today, I'm getting this error message in my cli: ImportError: Could not import 'users.authentication.CustomJWTAuthentication' for API setting 'DEFAULT_AUTHENTICATION_CLASSES'. ImportError: Module "users.authentication" does not define a "CustomJWTAuthentication" attribute/class. This is my REST_FRAMEWORK config in settings.py REST_FRAMEWORK = { 'DEFAULT_AUTHENTICATION_CLASSES': [ 'users.authentication.CustomJWTAuthentication', ], 'DEFAULT_PERMISSION_CLASSES': [ 'rest_framework.permissions.IsAuthenticated', ], ... } This is my /users/authentication.py, which has the CustomJWTAuthentication class: from django.conf import settings from rest_framework_simplejwt.authentication import JWTAuthentication class CustomJWTAuthentication(JWTAuthentication): def authenticate(self, request): try: header = self.get_header(request) if header is None: raw_token = request.COOKIES.get(settings.AUTH_COOKIE) else: raw_token = self.get_raw_token(header) if raw_token is None: return None validated_token = self.get_validated_token(raw_token) return self.get_user(validated_token), validated_token except: return None I'm running Python v3.12, Django v4.2, DRF v3.14 and DRF SimpleJWT v5.4 on Ubuntu 24 in a venv. I have no idea why this is happening all of a sudden? -
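A diagnostic sketch: DRF's settings loader wraps the failure, and the "does not define a ... attribute/class" variant usually means Python did import a module called users.authentication, just not the one (or not in the state) you expect, for example a stale duplicate file on the path or a circular import that leaves the module half-initialised. Importing it by hand in `python manage.py shell` shows which file was picked up and whether the class is there:

```python
import importlib

mod = importlib.import_module("users.authentication")
print(mod.__file__)                              # is this the file you are editing?
print(hasattr(mod, "CustomJWTAuthentication"))   # False points at a shadowed or partial module
```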
Unit testing Amazon SES in Django: emails not being sent
Creating unit tests for Amazon Simple Email Service (SES) for a Django application using the package django-ses test_mail.py from django.core import mail ... def test_send_direct_email(send_ct): from_email = settings.SERVER_EMAIL to_email = [nt[2] for nt in settings.NOTIFICATIONS_TESTERS] starttime = datetime.now() connection = mail.get_connection() pre_data = get_ses_emails_data() _mail_signal_assertion_handler.call_count = 0 signals.message_sent.connect(_mail_signal_assertion_handler) emails = [] for i in range(send_ct): emails.append( mail.EmailMessage( SUBJECT_EMAIL, BODY_EMAIL.format(send_ct=i, server=settings.EMAIL_BACKEND), from_email, to_email, # connection=connection, ) ) connection.send_messages(emails) post_data = get_ses_emails_data() assert int(post_data["24hour_sent"]) == int(pre_data["24hour_sent"]) + send_ct assert check_aws_ses_sent(assertions={"Sent": send_ct, "StartTime": starttime}) assert _mail_signal_assertion_handler.call_count == send_ct settings.py AWS_DEFAULT_REGION = "ca-central-1" try: # IAM programmatic user AWS_ACCESS_KEY_ID = env("AWS_ACCESS_KEY_ID") AWS_SECRET_ACCESS_KEY = env("AWS_SECRET_ACCESS_KEY") except KeyError: raise ImproperlyConfigured("Missing AWS_ACCESS_KEY_ID or AWS_SECRET_ACCESS_KEY") # =========== EMAIL ============== EMAIL_BACKEND = "django_ses.SESBackend" DEFAULT_FROM_EMAIL = env("DEFAULT_FROM_EMAIL") # verified aws ses identity SERVER_EMAIL = DEFAULT_FROM_EMAIL but the emails are never sent (AssertionError: False (0 == 1)). The service is working as expected when running live on the server. The assertions I am using are a connection to the message_sent signal (new in 4.4.0) from django_ses import signals def _mail_signal_assertion_handler(sender, message, **kwargs): _mail_signal_assertion_handler.call_count += 1 assert message.subject == SUBJECT_EMAIL assert message.body == BODY_EMAIL.format( send_ct=_mail_signal_assertion_handler.call_count, server=settings.EMAIL_BACKEND ) signals.message_sent.connect(_mail_signal_assertion_handler) and checking the SES data through a boto3 client session: from django_ses.views import emails_parse, stats_to_list, …
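One likely explanation worth checking first: Django's test machinery calls setup_test_environment(), which swaps EMAIL_BACKEND for the in-memory locmem backend for the duration of the test run, so mail.get_connection() never talks to SES and django-ses never fires message_sent. If this test really is meant to hit SES, the backend can be forced back explicitly; the sketch below uses placeholder addresses and a hypothetical test name.

```python
from django.core import mail
from django.test import override_settings


@override_settings(EMAIL_BACKEND="django_ses.SESBackend")
def test_send_direct_email_hits_ses():
    connection = mail.get_connection()  # now resolves to the SES backend
    message = mail.EmailMessage(
        "subject", "body", "from@example.com", ["to@example.com"]
    )
    assert connection.send_messages([message]) == 1
```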
Gmail OAuth2 - restrict the scope to only emails from a certain domain
I have a Django site that uses Google OAuth2 to allow users to grant access to read and reply to their emails. GOOGLE_OAUTH2_CREDENTIALS = { 'client_id': '********************', 'client_secret': '*******', 'scope': [ 'https://www.googleapis.com/auth/gmail.readonly', 'https://www.googleapis.com/auth/gmail.send' ], 'redirect_uri': 'https://www.********.com/*****/', } However, for privacy and security purposes I want to restrict the scope to only being able to read and reply to emails from a specific domain. Is it possible to modify the scope so that the permissions only apply to emails to/from a certain domain? -
How to filter data obtained through annotation?
There are 'images' that are attached to 'objects' through a ForeignKey; there can be several per 'object'. There are 'subjects' that are also attached to 'objects' through a ForeignKey. How do I attach to each 'subject' the one image from its 'object' that is marked "select=1"? Through annotation, I can get either the number of images or all images. Options that work, but are not what I need: Subject.objects.filter(object_id__hromada_id=hromada.id, state=3).annotate(image=Count('object__objectimages__image', filter=Q(object__objectimages__select=1))) or Subject.objects.filter(object_id__hromada_id=hromada.id, state=3).annotate(image=F('object__objectimages__image')) -
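A sketch using Subquery/OuterRef (the model and field names are inferred from the lookups above, so adjust them to the real ones): annotate each Subject with the single image whose select flag is 1 on its related object.

```python
from django.db.models import OuterRef, Subquery

selected_image = (
    ObjectImages.objects
    .filter(object_id=OuterRef("object_id"), select=1)
    .values("image")[:1]
)

subjects = (
    Subject.objects
    .filter(object_id__hromada_id=hromada.id, state=3)
    .annotate(image=Subquery(selected_image))
)
```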
CSV Timed RotatingFileHandler not rotating files
We need to fulfil a request for our Python (v3.11.7) Django (v3.2.23) app to log specific security related events on a csv file that will be rotated on an hourly basis and have a filename like audit_logs20250130_0800-0900.csv. Our Django back-end is running on a docker container with an entrypoint like gunicorn wsgi:application --bind 0.0.0.0:8000 --workers 4 --threads 4 We are trying to implement this by inheriting from logging.handlers.TimedRotatingFileHandler to implement a CSVTimedRotatingFileHandler that looks like this: import logging import os from datetime import datetime, timedelta from logging.handlers import TimedRotatingFileHandler import pytz import redis from django.conf import settings REDIS_KEY = 'CSVTimedRotatingFileHandler_RolloverAt' class CSVTimedRotatingFileHandler(TimedRotatingFileHandler): def __init__(self, filename, when, interval, backup_count, encoding=None, delay=False, headers=None): super().__init__(filename, when=when, interval=interval, backupCount=backup_count, encoding=encoding, delay=delay, utc=False, atTime=None, errors=None) self.headers = headers def emit(self, record): try: last_rollover_at = self.get_redis_rollover_at_value() # Check if a rollover happened and refresh the stream if needed (for multiple workers) if self.rolloverAt != last_rollover_at: self.rolloverAt = last_rollover_at if self.stream and not self.stream.closed: self.stream.close() self.stream = self._open() if self.shouldRollover(record): self.doRollover() # If the stream is still closed or None, open it again if self.stream is None or self.stream.closed: self.stream = self._open() # Write headers if the file is empty if self.stream.tell() == 0 and self.headers: self.stream.write(','.join(self.headers) … -
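An alternative sketch (illustrative, not the post's code): with several gunicorn workers, coordinating a single shared rollover is fragile, so another option is to derive the file name from the current hour on every write; each hour then gets its own file and no rotation step is needed at all.

```python
import csv
import logging
import os
from datetime import datetime, timedelta


class HourlyCSVHandler(logging.Handler):
    """Append each record to a CSV named after the current hour window."""

    def __init__(self, directory, headers=None):
        super().__init__()
        self.directory = directory
        self.headers = headers or []

    def _current_path(self):
        start = datetime.now().replace(minute=0, second=0, microsecond=0)
        end = start + timedelta(hours=1)
        name = f"audit_logs{start:%Y%m%d_%H%M}-{end:%H%M}.csv"
        return os.path.join(self.directory, name)

    def emit(self, record):
        path = self._current_path()
        write_headers = self.headers and not os.path.exists(path)
        with open(path, "a", newline="", encoding="utf-8") as fh:
            writer = csv.writer(fh)
            if write_headers:
                writer.writerow(self.headers)
            writer.writerow([datetime.now().isoformat(), record.getMessage()])
```

Appending keeps workers from truncating each other's output, but a cross-process lock (or a single worker owning the audit log) is still needed if the header row must be written exactly once per file.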
How to recreate the database for every Django test case?
I want to write test cases for django that use the database and change its entries. I want to create a new database for every test. How do I force django to delete the entire database after every testcase (either for every function or for every class)? I hope it doesn't matter, but I use SQLite. -
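A short sketch of the built-in options: django.test.TestCase wraps every test in a transaction that is rolled back afterwards, so each test starts from a clean database without rebuilding it, while TransactionTestCase flushes the tables between tests for code that needs real transaction behaviour. The test database itself is created once per run, not per test; the Thing model below is hypothetical.

```python
from django.test import TestCase, TransactionTestCase

from myapp.models import Thing  # hypothetical model for illustration


class ThingTests(TestCase):  # changes are rolled back after each test
    def test_create(self):
        Thing.objects.create(name="a")
        self.assertEqual(Thing.objects.count(), 1)

    def test_database_is_clean_again(self):
        self.assertEqual(Thing.objects.count(), 0)


class ThingTransactionTests(TransactionTestCase):  # tables flushed between tests
    def test_create(self):
        Thing.objects.create(name="b")
        self.assertEqual(Thing.objects.count(), 1)
```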
Django not loading image files with static
I have been working with Django and it isn't loading any image files but it is loading my CSS files in the same directory area. HTML page <!DOCTYPE html> <html lang="en"> {% load static %} <head> <meta charset="UTF-8"> <meta name="viewport" content="width=device-width, initial-scale=1.0"> <link href="{% static 'home.css' %}" type="text/css" rel="stylesheet"> <title>Hello World</title> </head> <body>`` <h1>Home</h1> <img href="{% static 'icons/folder.png' %}" alt="folder"> {% for file in files %} {% if file.1 == True %} <p><a href="/files/home/{{file.2}}">{{file}}></a></p> {% else %} <p>{{file}}</p> {% endif %} {% endfor %} </body> </html> settings.py BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__))) STATIC_URL = '/static/' STATIC_ROOT = os.path.join(BASE_DIR, 'static') This is the directory. The CSS file loads but the image doesn't. Let me know if you need any more information that I forgot to provide. -
Django and AWS Lambda runtime configuration change
I have a business application written in Django where each tenant should have a completely separate environment, including: Separate database schema Separate Redis instance Separate S3 bucket, etc. However, I want to deploy a single instance of the application on AWS Fargate or AWS Lambda to reduce management costs. Each tenant will have a different domain, and the Django configuration should dynamically change based on the tenant. My idea is to store all tenant-specific configurations (credentials, environment variables, etc.) in AWS AppConfig. For example: A single AWS RDS database with separate schemas for each tenant A shared AWS ElastiCache (Redis) instance but logically separated A single Django Celery setup Dynamic configuration loading based on the tenant Since each tenant has different database credentials, email credentials, payment gateway credentials (Stripe, PayPal, etc.), I want to ensure this approach is scalable and maintainable. My questions: Is this a good approach for multi-tenancy in Django? Are there better alternatives to managing per-tenant configurations dynamically? How should I handle tenant-based database connections efficiently in Django? Any recommendations or best practices would be greatly appreciated. -
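A sketch of one way to switch databases per tenant (all names are illustrative assumptions, not a recommendation from the post): resolve the tenant from the request's host in middleware, stash it in a thread local, and let a database router pick the matching alias from settings.DATABASES.

```python
import threading

_current = threading.local()


class TenantMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        # e.g. "acme.example.com" -> the "acme" database alias
        _current.tenant = request.get_host().split(".")[0]
        try:
            return self.get_response(request)
        finally:
            _current.tenant = None


class TenantDatabaseRouter:
    def db_for_read(self, model, **hints):
        return getattr(_current, "tenant", None) or "default"

    def db_for_write(self, model, **hints):
        return self.db_for_read(model, **hints)
```

Per-tenant secrets pulled from AWS AppConfig could be merged into settings.DATABASES at startup, and under ASGI the thread local should become a contextvar so the tenant does not leak between concurrent requests.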
Django REST Framework: Custom Action (POST) without Serializer
I am using Django REST Framework (DRF) and have a ModelViewSet with a custom action (@action) for canceling an order. The cancel action does not require a serializer since it only modifies the database and returns a response. However, when I set serializer_class = None, it does not work as expected. Here’s my ViewSet: class OrderViewSet(ModelViewSet): http_method_names = ['get', 'post', 'patch', 'delete', 'head', 'options'] @action(detail=True, methods=['post'], permission_classes=[IsAuthenticated], serializer_class=None #Not Working) def cancel(self, request, pk=None): order = self.get_object() OrderService.cancel_order(order, request.user) return Response({'status': 'Order cancelled'}, status=status.HTTP_200_OK) def get_serializer_class(self): if self.action == 'cancel': return None # Causes an error if self.action == 'cancel': return EmptySerializer # This way works if self.request.method == 'POST': return CreateOrderSerializer elif self.request.method == 'PATCH': return UpdateOrderSerializer return OrderSerializer Issues: Setting serializer_class = None inside @action does not work. Returning None in get_serializer_class causes a 'NoneType' object is not callable error. For now, I created an empty serializer to bypass DRF’s requirement, but that looks like overkill to achieve this. Is there a better way to do it? -
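One common pattern (a sketch of a drop-in replacement for the get_serializer_class above): instead of returning None, return DRF's plain base Serializer for actions that carry no payload. It already has no fields, so no custom EmptySerializer is needed and the browsable API and schema generation keep working.

```python
from rest_framework import serializers


def get_serializer_class(self):
    if self.action == "cancel":
        return serializers.Serializer  # field-less base serializer
    if self.request.method == "POST":
        return CreateOrderSerializer
    if self.request.method == "PATCH":
        return UpdateOrderSerializer
    return OrderSerializer
```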
How to send images as binary data with the request
Here's what I have models.py class Post(models.Model): id = models.AutoField(primary_key=True) text = models.TextField(max_length=165) author = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE) created_at = models.DateTimeField(auto_now_add=True) def __str__(self): return f'{self.author} posts "{self.text}"' class Images(models.Model): id = models.AutoField(primary_key=True) image = models.ImageField(upload_to='images/') post_id = models.ForeignKey(Post, on_delete=models.CASCADE) def __str__(self): return f'{self.post_id.id} - "{self.image}"' serializers.py class ImageSerializer(serializers.ModelSerializer): class Meta: model = Images fields = ('image',) class PostSerializer(serializers.ModelSerializer): images = ImageSerializer(many=True, read_only=True, source='images_set') author = UserSerializer(read_only=True) comments = CommentSerializer(many=True, read_only=True, source='comment_set') likes_count = serializers.SerializerMethodField() class Meta: model = Post fields = ['id', 'author', 'text', 'images', 'created_at', "likes_count", 'comments'] def create(self, validated_data): validated_data["author"] = self.context["request"].user return super().create(validated_data) def validate(self, data): if data.get('author') == self.context["request"].user: raise serializers.ValidationError('Logged in User is not an Author') return data def get_likes_count(self, obj): return obj.postlikes_set.count() views.py class NewPost(APIView): permission_classes = [IsAuthenticated] parser_classes = [JSONParser] def post(self, request): text = request.data.get("text") post_id = Post.objects.create(author_id=request.user.id ,text=text) images = request.FILES.getlist('images') for image in images: Images.objects.create(image=image, post_id=post_id) return Response({"message": "Успешно", 'received data': request.data}, status=status.HTTP_201_CREATED) I need to send post data consisting of text and multiple images. I've been told the way to do it is to send images "as binary data" and was linked docs pages for JSON Parsers. I have no idea what that actually means. If I try to send … -
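A sketch of the multipart approach, which is what "send the images as binary data" usually means in practice: the client submits multipart/form-data (the text field plus one or more files under the same key) and the view uses a parser that populates request.FILES instead of expecting a JSON body. The model and view names mirror the post; the models import path is an assumption.

```python
from rest_framework import status
from rest_framework.parsers import FormParser, MultiPartParser
from rest_framework.permissions import IsAuthenticated
from rest_framework.response import Response
from rest_framework.views import APIView

from .models import Images, Post  # assumed import path


class NewPost(APIView):
    permission_classes = [IsAuthenticated]
    parser_classes = [MultiPartParser, FormParser]  # instead of JSONParser

    def post(self, request):
        post = Post.objects.create(
            author_id=request.user.id, text=request.data.get("text")
        )
        for image in request.FILES.getlist("images"):
            # the FK field is named post_id in the post's model,
            # so it takes the Post instance itself
            Images.objects.create(image=image, post_id=post)
        return Response({"message": "Created"}, status=status.HTTP_201_CREATED)
```

From the client (curl, Postman or JavaScript FormData) the request is sent as multipart/form-data with the images key repeated once per file.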
Saving a Django ModelForm to the model db
The following code is not saving to the db; the code works well in the shell. Everything is fine, but it's not saving to the db. Could someone figure it out? def loginapp(request): if request.method == "POST": form=LoginInterfaceForm(request.POST) if form.is_valid(): form.clean() login = form.save(commit=False) login.save( I was expecting the form data to be saved in the db -
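For comparison, a sketch of the standard pattern (the form class name comes from the post; the template, import path and redirect target are assumptions). If is_valid() returns False the save never runs, so re-rendering the bound form with its errors makes a silent failure visible; there is also no need to call clean() yourself or to use commit=False unless the instance is modified before saving.

```python
from django.shortcuts import redirect, render

from .forms import LoginInterfaceForm  # assumed import path


def loginapp(request):
    if request.method == "POST":
        form = LoginInterfaceForm(request.POST)
        if form.is_valid():
            form.save()              # writes the row to the database
            return redirect("home")  # hypothetical URL name
    else:
        form = LoginInterfaceForm()
    # invalid POSTs fall through here with form.errors populated
    return render(request, "loginapp.html", {"form": form})
```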
How to step into a function at a Python breakpoint in the terminal
I am trying to step into a function that I call in my code, but in the terminal pdb just goes through it, like this: -> def get_context_data(self, *args, **context): (Pdb) n --Return-- > d:\rebound\rebound\modules\blog\views.py(135)PostDetail()-><cell at 0x00...1DA4A0: empty> -> def get_context_data(self, *args, **context): So which command can I hit to step into the called function? -
Expand a QuerySet with all related objects
class Hobby(models.Model): name = models.TextField() class Person(models.Model): name = models.TextField() created_at = models.DateTimeField(auto_now_add=True) hobbies = models.ManyToManyField(Hobby, related_name='persons') class TShirt(models.Model): name = models.TextField() person = models.ForeignKey( Person, related_name='tshirts', on_delete=models.CASCADE, ) class Shirt(models.Model): name = models.TextField() person = models.ForeignKey( Person, related_name='shirts', on_delete=models.CASCADE, ) class Shoes(models.Model): name = models.TextField() person = models.ForeignKey( Person, related_name='shoes', on_delete=models.CASCADE, ) Given a queryset of Person, e.g. Person.objects.order_by('-created_at')[:4] How can I make a queryset which also includes all the objects related to the Person objects in that queryset? The input QuerySet only has Person objects, but the output one should have Hobby, Shoes, TShirt, Shirt` objects (if there are shirts/tshirts/shoes that reference any of the people in the original queryset). I've only been able to think of solutions that rely on knowing what the related objects are, e.g. TShirt.objects.filter(person__in=person_queryset), but I would like a solution that will work for all models that reference Person without me having to one-by-one code each query for each referencing model. -
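A sketch of a generic approach: ask the model's _meta for every relation pointing at it (plus its own many-to-many fields) and prefetch them all, so any new model that references Person is picked up without hand-written queries. Note that a single queryset cannot mix rows of different models; the related objects end up attached to each Person instance instead.

```python
def prefetch_all_related(queryset):
    model = queryset.model
    names = [
        rel.get_accessor_name()
        for rel in model._meta.related_objects
        if rel.get_accessor_name()  # skip relations declared with related_name="+"
    ]
    names += [field.name for field in model._meta.many_to_many]
    return queryset.prefetch_related(*names)


people = prefetch_all_related(Person.objects.order_by("-created_at")[:4])
for person in people:
    print(person.name, list(person.tshirts.all()), list(person.hobbies.all()))
```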
Can I use Django's admin for users too? I am building an algo trading bot in Django [closed]
Currently, I am building an algorithmic trading bot with Django, and I don't expect many users—mainly just me and a few of my friends. Given this, do I need to implement custom templates for user interfaces, or can I simply use Django's default admin interface for managing users as well? A user should be able to activate an available strategy, and the trade will then happen automatically. -
Queryset pagination in websocket django rest framework
I have a websocket in my Django DRF project and a function that should read all Notification objects from the database and return them via a serializer, like below: @database_sync_to_async def get_all_notifications(self): paginator = CustomPageNumberPagination() notifications = Notification.objects.all().order_by('-created_at') context = paginator.paginate_queryset(notifications, WhatQuery?) serializer = NotificationSerializer(context=context, many=True) return paginator.get_paginated_response(serializer.data) I do not know how to get the request in the socket, and I think it is not accessible there: paginator.paginate_queryset(queryset, request) What is the solution here? -
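A sketch that skips the DRF request entirely: inside a consumer there is no HTTP request to hand to paginate_queryset(), so one option is Django's own Paginator, taking the page number from the websocket message payload and building the paginated payload by hand (names mirror the post where possible).

```python
from channels.db import database_sync_to_async
from django.core.paginator import Paginator


@database_sync_to_async
def get_notifications_page(self, page_number=1, page_size=20):
    queryset = Notification.objects.all().order_by("-created_at")
    page = Paginator(queryset, page_size).get_page(page_number)
    serializer = NotificationSerializer(page.object_list, many=True)
    return {
        "count": page.paginator.count,
        "num_pages": page.paginator.num_pages,
        "page": page.number,
        "results": serializer.data,
    }
```

The consumer can read page_number out of the incoming JSON message and pass it in, which replaces the role the query string plays for CustomPageNumberPagination over HTTP.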
How to create graphql type for a django model which has many-to-many field
I have a Django model named Profile. It has some basic fields and a many-to-many field, followers. This field contains a list of followers and the people being followed. class Profile(models.Model): user = models.OneToOneField( User, on_delete=models.CASCADE) birth_date = models.DateField( null=True, blank=True) profile_picture = models.ImageField( upload_to='user_profile_pictures/', blank=True, null=True) cover_picture = models.ImageField( upload_to='user_cover_pictures/', blank=True, null=True) profile_description = models.TextField( blank=True, null=True) profile_rating = models.IntegerField( default=0) followers = models.ManyToManyField( 'self', symmetrical=False, related_name='following', blank=True) I used ChatGPT to create a type for this model: class ProfileType(DjangoObjectType): class Meta: model = Profile fields = "__all__" followers = graphene.List(lambda: ProfileType) following = graphene.List(lambda: ProfileType) followers_count = graphene.Int() following_count = graphene.Int() def resolve_followers(self, info): return self.followers.all() def resolve_following(self, info): return self.following.all() def resolve_followers_count(self, info): return self.followers.count() def resolve_following_count(self, info): return self.following.count() The issue is that graphene.List doesn't have the all() and count() methods. How should I handle this field? -
Improve the query structure, or how can I use prefetch_related?
I have a situation where I'm not able to work out how to use prefetch_related or improve the structure of my query. Here is the scenario: rows = DocumentRows.objects.filter(item=item_id).values_list("document", flat=True) qs = Document.objects.filter(id__in = rows) return qs Document contains the ID and other important related info. Document is linked to DocumentRows as a foreign key, a one-to-many relationship: each Document can have multiple rows, and each row contains an item (item_id). I'm trying to filter documents based on the items present in their document rows. Thanks -
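A sketch of the same query done in one step (the reverse lookup name is an assumption: with no related_name set, Django defaults to the lowercased model name, i.e. documentrows for filtering and documentrows_set for prefetching): filtering across the relation lets the database do a single join instead of materialising the intermediate list of ids.

```python
# documents whose rows contain the given item, de-duplicated after the join
documents = Document.objects.filter(documentrows__item=item_id).distinct()

# if the rows themselves are needed afterwards, load them in the same pass
documents = documents.prefetch_related("documentrows_set")
```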
WebSocket Returns 200 Instead of 101 with Apache2: How to Properly Configure Proxy for WebSocket Handling?
I am trying to configure Apache2 to proxy WebSocket traffic for my application, but I’m facing an issue where the WebSocket connection returns a 200 status instead of the expected 101 status code. I have the following configuration in my default-ssl.conf file for SSL and WebSocket proxy: ServerName <domain> <IfModule mod_ssl.c> <VirtualHost _default_:443> ServerAdmin webmaster@localhost DocumentRoot /home/root/app ErrorLog ${APACHE_LOG_DIR}/error.log CustomLog ${APACHE_LOG_DIR}/access.log combined SSLEngine on SSLCertificateFile /etc/ssl/certs/ssl-cert-snakeoil.pem SSLCertificateKeyFile /etc/ssl/private/ssl-cert-snakeoil.key # WebSocket upgrade headers SetEnvIf Request_URI "^/ws" upgrade_connection RequestHeader set Connection "Upgrade" env=upgrade_connection RequestHeader set Upgrade "websocket" env=upgrade_connection # Proxy to WebSocket server ProxyPreserveHost On ProxyPass /wss/ wss://127.0.0.1:8080/ ProxyPassReverse /wss/ wss://127.0.0.1:8080/ # Proxy WebSocket traffic to Daphne (ASGI) RewriteEngine On RewriteCond %{HTTP:UPGRADE} ^WebSocket$ [NC,OR] RewriteCond %{HTTP:CONNECTION} ^Upgrade$ [NC] RewriteRule .* ws://127.0.0.1:8080%{REQUEST_URI} [P,QSA,L] # Proxy settings ProxyRequests on ProxyPass / http://<ip>:8000/ ProxyPassReverse / http://<ip>:8000/ </VirtualHost> Despite this configuration, the WebSocket connection is returning an HTTP 200 status instead of the expected WebSocket handshake (101). What could be causing this issue and how can I resolve it?