Django community: RSS
This page, updated regularly, aggregates Django Q&A from the Django community.
-
How do I serve my Django app over an SSH connection with limited user permissions?
The context is the following: at my teacher's request I'm using an SSH connection to my college's server in order to serve my Django application. I found out - I suppose - that I can deploy my Django application with gunicorn, and that I can bind it to a specific port. I run the following command:

    gunicorn --bind 0.0.0.0:8000 myapp.wsgi

Then I go to my college's site address with that port, collegesite:8000/, and get a 404. What could be the problem here? Am I lacking permissions from the OS? Do I need to run gunicorn as sudo?
-
Sending data with an ESP8266 and receiving it with Django
I want to send data from an ESP8266 to Django. How can I do that? Can you please help me? Thanks! I tried this code in Django:

    # views.py
    from django.http import HttpResponse
    # the Data model is assumed to be imported from this app's models

    def data(request):
        if request.method == "POST":
            data = Data()
            data.temperature = request.POST.get('Temperature')
            api_key = request.POST.get('api_key')
            data.save()
            return HttpResponse("<h1>Data received and saved.</h1>")
        elif request.method == "GET":
            return HttpResponse("<h1>Data received and saved.</h1>")
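For the device side of things, the view above also needs a URL route, and a POST coming from a microcontroller carries no CSRF token, so the view is typically exempted from CSRF checking. A minimal sketch under those assumptions (URL path, host and payload below are made up; the requests call just simulates what the ESP8266 would send over plain HTTP):

    # urls.py of the app that holds the view above
    from django.urls import path
    from . import views

    urlpatterns = [
        path("data/", views.data, name="data"),
    ]

    # the view itself would be decorated with @csrf_exempt
    # (from django.views.decorators.csrf import csrf_exempt) so token-less POSTs are accepted

    # simulating the ESP8266 request from Python:
    import requests
    requests.post("http://example-server:8000/data/", data={"Temperature": "23.5", "api_key": "secret"})
-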
How to connect a Bootstrap form to a Django form?
So, I need the styling of the Bootstrap classes, but the input also has to actually connect to the Django form in question.

    <div class="row">
      <div class="col-sm-3">
        <label for="last_name" class="form-label">Last Name</label>
      </div>
      <div class="col-sm-9">
        <input type="text" class="form-control" placeholder="Your last name..." id="last_name" value="{{form.last_name.value}}">
      </div>
    </div>

{{form.last_name}} on its own, for example, renders the field and actually connects it to the database, as it should. It looks ugly, though, and is thus not what I need. value="{{form.last_name}}" in the code block above renders the start of the input tag.
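One common approach (a sketch; the form and model names below are stand-ins for whatever ModelForm the question actually uses) is to keep rendering the bound field with {{ form.last_name }} and attach the Bootstrap classes to the field's widget instead of writing the input by hand:

    from django import forms

    class MyForm(forms.ModelForm):  # stand-in for the actual form
        class Meta:
            model = MyModel  # hypothetical model
            fields = ["last_name"]

        def __init__(self, *args, **kwargs):
            super().__init__(*args, **kwargs)
            self.fields["last_name"].widget.attrs.update(
                {"class": "form-control", "placeholder": "Your last name..."}
            )

The template keeps the Bootstrap grid markup and renders {{ form.last_name }} inside the col-sm-9 div; because the widget emits the proper name attribute, the submitted value reaches the form and the database, unlike a hand-written input tag.
-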
django cookiecutter shared database server
This is a very easy question, but for some reason I can't find the answer myself. I have two Django microservices; I combined them with Docker, but for some reason they don't want to connect to the Postgres database. How can I resolve this? Here are my configs (I'm using django-cookiecutter); the Dockerfiles are the standard ones.

Service A, .envs/.local/.postgres:

    # ------------------------------------------------------------------------------
    POSTGRES_HOST = postgres
    POSTGRES_PORT = 5432
    POSTGRES_DB = service_A
    POSTGRES_USER = debug
    POSTGRES_PASSWORD = debug

Service B, .envs/.local/.postgres:

    # ------------------------------------------------------------------------------
    POSTGRES_HOST = postgres
    POSTGRES_PORT = 5432
    POSTGRES_DB = service_B
    POSTGRES_USER = debug
    POSTGRES_PASSWORD = debug

docker-compose.yml:

    version: "3.8"

    volumes:
      postgres_data: {}
      postgres_data_backups: {}

    services:
      service_A:
        build:
          context: service_A
          dockerfile: ./compose/local/django/Dockerfile
        image: service_A_local
        ports:
          - '8001:8001'
        command: /start

      service_B:
        build:
          context: service_B
          dockerfile: ./compose/local/django/Dockerfile
        image: service_B_local
        ports:
          - '8002:8002'
        command: /start

      postgres:
        build:
          context: .
          dockerfile: ./service_A/compose/production/postgres/Dockerfile
        image: postgres
        container_name: postgres
        volumes:
          - postgres_data:/var/lib/postgresql/data
          - postgres_data_backups:/backups
        env_file:
          - service_A/.envs/.local/.postgres
          - service_B/.envs/.local/.postgres
-
Django framework problems
For example, from all these movies, when the user chooses a movie, how do I show him specifically the movie he chose? How is it done? I would like a detailed explanation with code. When he presses the watch button and I take him to the page for a single movie, how do I show him the movie he specifically chose?
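A typical pattern for this (a minimal sketch; the Movie model, URL name and template path are assumptions, not taken from the question) is to put the chosen movie's primary key into the URL behind the watch button and look it up in a detail view:

    # urls.py
    from django.urls import path
    from . import views

    urlpatterns = [
        path("movies/<int:pk>/", views.movie_detail, name="movie_detail"),
    ]

    # views.py
    from django.shortcuts import get_object_or_404, render

    def movie_detail(request, pk):
        movie = get_object_or_404(Movie, pk=pk)  # Movie model assumed
        return render(request, "movies/detail.html", {"movie": movie})

The watch button in the list template then links to {% url 'movie_detail' movie.pk %}, so each button carries the id of its own movie.
-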
Record deletion from a Django model
If I have a live application with thousands of users, how do I delete a record from a Django model on the live server without affecting anything? Currently I first reverse the migration, then delete the record and run makemigrations. I try this on my local machine, and after making the changes I push them to the server.
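As a side note that may clarify the workflow: deleting a row is a data operation, not a schema change, so it does not involve reversing migrations or running makemigrations at all. A minimal ORM sketch (app, model and primary key are placeholders):

    # e.g. in `python manage.py shell` on the server, or in a small management command
    from myapp.models import MyModel  # hypothetical app and model

    MyModel.objects.filter(pk=42).delete()

Migrations are only needed when the model definition itself changes (fields added, removed or renamed).
-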
Django keeps recreating the same migration file for ManyToManyField with default value
I'm encountering a recurring issue in my Django project where Django keeps recreating the same migration file for a ManyToManyField whose default is set to MyModel.objects.all. Each time I run python manage.py makemigrations, it generates a new migration file that contains the same code and only increments the dependencies, even though the model hasn't changed. When running python manage.py migrate it shows me:

    Running migrations:
      No migrations to apply.
      Your models in app(s): 'my_model' have changes that are not yet reflected in a migration, and so won't be applied.
      Run 'manage.py makemigrations' to make new migrations, and then re-run 'manage.py migrate' to apply them.

Here's an example of the migration code that keeps getting regenerated:

    # 0016_alter_my_model_my_field_name_and_more.py
    class Migration(migrations.Migration):
        dependencies = [
            ("another_service", "0004_previous_migration"),
            ("this_service", "0015_alter_my_model_my_field_name_and_more"),
        ]
        operations = [
            migrations.AlterField(
                model_name="my_model",
                name="my_field_name",
                field=models.ManyToManyField(
                    default=django.db.models.manager.BaseManager.all,
                    help_text="My field text help.",
                    to="another_service.my_other_model",
                ),
            ),
        ]

The model and field are defined like this:

    class MyModel(PolymorphicModel):
        ...
        my_field_name = models.ManyToManyField(
            "another_service.AnotherModel",
            default=AnotherModel.objects.all,
            help_text="My field text help.",
        )
        ...

How can I make my_field_name contain all items of AnotherModel from the database without makemigrations recreating this migration in an infinite loop?
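For context: a default on a ManyToManyField has no effect at the database level, and a bound manager method such as AnotherModel.objects.all cannot be serialized faithfully into a migration (it ends up as BaseManager.all), which appears to be why the autodetector keeps seeing a difference. One hedged workaround is to drop the default and populate the relation after each instance is created, for example with a post_save signal (the receiver name below is an assumption):

    from django.db.models.signals import post_save
    from django.dispatch import receiver

    @receiver(post_save, sender=MyModel)
    def populate_my_field_name(sender, instance, created, **kwargs):
        # attach every AnotherModel row to freshly created MyModel instances
        if created:
            instance.my_field_name.set(AnotherModel.objects.all())
-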
Internal Server Error: MultiValueDictKeyError in Django
I have a problem with my ecommerce project. When I try to add an item to the cart I get this error, yet the item does end up in the cart and the error only shows up in my console. Can you please help me get rid of it?

    html code - https://i.stack.imgur.com/LCns9.png
    js code - https://i.stack.imgur.com/6NE7L.png
    view / function - https://i.stack.imgur.com/mja5v.png
    VS Code / console - https://i.stack.imgur.com/aufFg.png
    browser / console - https://i.stack.imgur.com/hcwMm.png
    error detail - https://i.stack.imgur.com/gEV1Y.png
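Without the linked screenshots the exact key cannot be known, but MultiValueDictKeyError is what Django raises when a view indexes request.POST or request.GET with a key the request did not send, which fits an AJAX add-to-cart call that sometimes omits a field. A hedged sketch of the usual defensive pattern (the view and key names are assumptions):

    from django.http import JsonResponse

    def add_to_cart(request):
        # .get() returns None instead of raising MultiValueDictKeyError when the key is missing
        product_id = request.POST.get("productId")
        if product_id is None:
            return JsonResponse({"error": "productId missing"}, status=400)
        return JsonResponse({"ok": True, "productId": product_id})
-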
PyCharm warning in Django code: Unresolved attribute reference 'user' for 'WSGIRequest'
In my brand-new PyCharm Professional Edition I get the warning "Unresolved attribute reference 'user' for 'WSGIRequest'". The warning occurs in the first line of MyClass.post() (see code below).

    from django.views import generic
    from django.db import models
    from django.http.request import HttpRequest
    from django.shortcuts import render
    from django.contrib.auth.models import AbstractUser

    class CustomUser(AbstractUser):
        emailname = models.CharField()

    class Invoice(models.Model):
        posted = models.DateField()

    class BaseView(generic.DetailView):
        model = Invoice

        def __init__(self, *args, **kwargs):
            super().__init__(*args, **kwargs)
            # some more code

    class MyClass(BaseView):
        def __init__(self, *args, **kwargs):
            super().__init__(*args, **kwargs)
            # some more code

        def post(self, request: HttpRequest, *args, **kwargs):
            context = {
                'curr_user': request.user,  # <-- Unresolved attribute reference 'user' for class 'WSGIRequest'
            }
            # some more code
            html_page = 'mypage.html'
            return render(request, html_page, context)

user is a CustomUser object. Debugging shows that user is a known attribute of CustomUser, and the code works well. Other attributes of request, like request.path or request.REQUEST, are not flagged. Django support is enabled. I work with PyCharm 2023.2.5 (Professional Edition) on Windows 11. Can anybody help? What is my mistake? Regards
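The user attribute is attached to the request at runtime by django.contrib.auth's AuthenticationMiddleware, so static analysis of HttpRequest/WSGIRequest cannot see it; the code itself is fine. One common workaround (a sketch, not an official Django API) is to annotate the parameter with a small subclass that declares the attribute, e.g. with the CustomUser from the question:

    from django.http import HttpRequest

    class AuthenticatedHttpRequest(HttpRequest):
        # declared only for the IDE / type checkers; set at runtime by AuthenticationMiddleware
        user: "CustomUser"

    # then: def post(self, request: AuthenticatedHttpRequest, *args, **kwargs): ...
-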
Django LDAP search restriction by group does not work; search by CN does not work
If I search with this query:

    AUTH_LDAP_USER_SEARCH = LDAPSearch("OU=E,DC=i,DC=e,DC=int", ldap.SCOPE_SUBTREE, "(sAMAccountName=%(user)s)")

the person is found and logged in. If I use a query like this instead:

    AUTH_LDAP_USER_SEARCH = LDAPSearch("CN=allow,OU=Groups,DC=i,DC=e,DC=int", ldap.SCOPE_SUBTREE, "(sAMAccountName=%(user)s)")

I get the following error:

    Invoking search_s('CN=allow,OU=Groups,DC=i,DC=e,DC=int', 2, '(sAMAccountName=a.t)')
    search_s('CN=allow,OU=Groups,DC=i,DC=e,DC=int', 2, '(sAMAccountName=%(user)s)') returned 0 objects:
    Authentication failed for a.t: failed to map the username to a DN.

Why? I need to allow access only to people in a specific group. If I do this:

    AUTH_LDAP_REQUIRE_GROUP = "CN=allow,OU=Groups,DC=i,DC=e,DC=int"
    AUTH_LDAP_GROUP_TYPE = GroupOfUniqueNamesType()
    AUTH_LDAP_GROUP_SEARCH = LDAPSearch(
        "CN=allow,OU=Groups,DC=i,DC=e,DC=int", ldap.SCOPE_SUBTREE, "(objectClass=groupOfNames)")

I get this error:

    cn=Tim Allen,ou=1,ou=2,ou=e,dc=i,dc=e,dc=int is not a member of cn=allow,ou=groups,dc=i,dc=e,dc=int
    Authentication failed for a.t: user does not satisfy AUTH_LDAP_REQUIRE_GROUP

In this case the user definitely is in the group. Please tell me what to fix? Thank you
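A hedged observation: user objects do not live under the group's DN, so the user search has to stay on the user OU; the group restriction is expressed through the group settings, and with Active Directory the group type usually needs to be an AD-aware one rather than GroupOfUniqueNamesType. A possible configuration sketch (the DNs are copied from the question; everything else is an assumption to adapt):

    import ldap
    from django_auth_ldap.config import LDAPSearch, NestedActiveDirectoryGroupType

    AUTH_LDAP_USER_SEARCH = LDAPSearch(
        "OU=E,DC=i,DC=e,DC=int", ldap.SCOPE_SUBTREE, "(sAMAccountName=%(user)s)"
    )
    AUTH_LDAP_GROUP_SEARCH = LDAPSearch(
        "OU=Groups,DC=i,DC=e,DC=int", ldap.SCOPE_SUBTREE, "(objectClass=group)"
    )
    AUTH_LDAP_GROUP_TYPE = NestedActiveDirectoryGroupType()
    AUTH_LDAP_REQUIRE_GROUP = "CN=allow,OU=Groups,DC=i,DC=e,DC=int"
-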
How to create a blob column via Python migrate
I know it is a bad idea to store a whole file's binary content in a database blob column, so forgive me. I have a model defined in Python code:

    class UploadFile(models.Model):
        id = models.AutoField(auto_created=True, primary_key=True)
        FileBinary = models.FileField(blank=False, default=None)

Then I run the migrations:

    python manage.py makemigrations && python manage.py migrate

But in MySQL I get these columns:

    COLUMN_NAME   DATA_TYPE
    id            int
    FileBinary    varchar

How do I fix the code to create a blob instead of a varchar?
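This is expected behaviour rather than a bug: FileField stores only the path of a file saved on the storage backend, which is why the column comes out as varchar. If the bytes themselves really must live in the database, BinaryField maps to a blob-type column on the MySQL backend; a minimal sketch:

    from django.db import models

    class UploadFile(models.Model):
        id = models.AutoField(auto_created=True, primary_key=True)
        file_binary = models.BinaryField()  # created as a (long)blob column by the MySQL backend
-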
Annotate Django Foreign Key model instance
Is it possible in Django to annotate a Foreign Key instance? Suppose I have the following models:

    class BaseModel(models.Model):
        pass

    class Foo(models.Model):
        base_model = models.ForeignKey('BaseModel', related_name='foos')

    class Bar(models.Model):
        base_model = models.ForeignKey('BaseModel', related_name='bars')

I want to count the Bars belonging to a BaseModel attached to a Foo, that is:

    foos = Foo.objects.all()
    for foo in foos:
        foo.base_model.bars_count = foo.base_model.bars.count()

Is it possible in a single query? The following code is syntactically wrong:

    foos = Foo.objects.annotate(
        base_model.bars_count=Count('base_model__bars')
    )

This one would perform that job in a single query:

    foos = Foo.objects.annotate(
        base_model_bars_count=Count('base_model__bars')
    )
    for foo in foos:
        foo.base_model.bars_count = foo.base_model_bars_count

Is there a way with a single query without the loop?
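One possible approach (a sketch; it costs two queries rather than one, but removes the Python loop and the per-object counting) is to prefetch the ForeignKey with an annotated queryset, so that each foo.base_model already carries bars_count:

    from django.db.models import Count, Prefetch

    foos = Foo.objects.prefetch_related(
        Prefetch(
            "base_model",
            queryset=BaseModel.objects.annotate(bars_count=Count("bars")),
        )
    )
    # foo.base_model.bars_count is now populated for every foo
-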
Double Increment Issue in TotalVote Count After Payment in Django App
I'm currently developing a talent application where participants can receive votes from voters, and each vote incurs a cost. The funds collected from these votes are meant to be credited to the respective contestants. For instance, if someone casts a vote for a contestant with a count of 10 votes, the contestant's total vote count should increase by the number of votes received. However, I've encountered an issue after a user casts a vote and makes a payment: the total vote count is incrementing by twice the intended value. For example, if a user submits a vote of 10, the contestant's total vote count increases by 20, which is unexpected and seems incorrect. Any assistance in resolving this issue would be greatly appreciated. Please find my code below:

    def contestant_detail_payment(request, slug, *args, **kwargs):
        contestant = get_object_or_404(Contestant, slug=slug)
        vote_charge = contestant.get_vote_per_charge()
        if request.method == "POST":
            email = request.POST.get("email", "")
            vote = request.POST.get("vote", "")
            amount = int(request.POST["amount"].replace("₦", ""))
            pk = settings.PAYSTACK_PUBLIC_KEY
            payment = Payment.objects.create(
                contestant=contestant,
                amount=amount,
                email=email,
                vote=vote,
            )
            payment.save()
            context = {
                "payment": payment,
                "field_values": request.POST,
                "paystack_pub_key": pk,
                "amount_value": payment.amount_value(),
                "contestant": contestant,
            }
            return render(request, "competitions/make_payment.html", context)
        context = {
            "contestant": contestant,
            "vote_charge": vote_charge,
        }
        return render(request, …
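One hedged guess rather than a confirmed diagnosis: Payment.objects.create() already writes the row, so the following payment.save() runs the model's save logic and fires the post_save signal a second time. If the contestant's total vote count is incremented inside Payment.save() or in a post_save handler, that alone would double it. A sketch of the change:

    payment = Payment.objects.create(
        contestant=contestant,
        amount=amount,
        email=email,
        vote=vote,
    )
    # no extra payment.save() here: create() has already written the row,
    # so saving again re-triggers any signal/save-side logic
-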
How do I put an ID?
I have a Django backend project connected to a MongoDB database. I want to manually and explicitly define the id field inside this class as it is stored in the database, so that Django and Mongo can talk to each other and I can edit or delete records through the Django admin panel.

    from djongo import models

    class Category2(models.Model):
        # id = ?
        name = models.CharField(max_length=200)
        slug = models.SlugField(max_length=200, unique=True)
        price = models.PositiveIntegerField()

        class Meta:
            ordering = ('name',)

        def __str__(self):
            return self.name

Unfortunately, I have no idea how to do it.
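A hedged sketch, assuming djongo is the database backend as the import suggests: djongo provides an ObjectIdField that maps the model's primary key onto MongoDB's _id, which is usually what lets the Django admin address, edit and delete individual documents:

    from djongo import models

    class Category2(models.Model):
        _id = models.ObjectIdField(primary_key=True)  # maps to MongoDB's _id
        name = models.CharField(max_length=200)
        slug = models.SlugField(max_length=200, unique=True)
        price = models.PositiveIntegerField()
-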
Why are styles not pulled in when running a Django application through nginx?
I know that there are a large number of similar questions, and I have tried probably all the proposed options, but nothing helped; maybe I'm doing something wrong. The problem is that when I start the app normally all the styles are pulled in and everything works, but when I run it through the production server they disappear. Here is the code in use:

settings.py:

    STATIC_URL = '/static/'
    STATIC_ROOT = os.path.join(BASE_DIR, "staticfiles")

nginx.conf:

    upstream hello_django {
        server web:8000;
    }

    server {
        listen 80;

        location / {
            proxy_pass http://hello_django;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header Host $host;
            proxy_redirect off;
        }

        location /static/ {
            alias /home/app/web/staticfiles/;
        }

        location /media/ {
            alias /home/app/web/mediafiles/;
        }
    }

docker-compose.prod.yml:

    version: '3.8'

    services:
      web:
        build:
          context: ./app
          dockerfile: Dockerfile.prod
        command: gunicorn hello_django.wsgi:application --bind 0.0.0.0:8000
        volumes:
          - static_volume:/home/app/web/static
          - media_volume:/home/app/web/media
        expose:
          - 8000
        env_file:
          - ./.env.prod
        depends_on:
          - db
        healthcheck:
          test: ["CMD-SHELL", "pg_isready -U hello_django -h db"]
          interval: 10s
          timeout: 5s
          retries: 3
      db:
        image: postgres:15
        volumes:
          - postgres_data:/var/lib/postgresql/data/
        environment:
          - POSTGRES_USER=hello_django
          - POSTGRES_PASSWORD=hello_django
          - POSTGRES_DB=hello_django_prod
      nginx:
        build: ./nginx
        volumes:
          - static_volume:/home/app/web/staticfiles
          - media_volume:/home/app/web/mediafiles
        ports:
          - 1337:80
        depends_on:
          - web

    volumes:
      postgres_data:
      static_volume:
      media_volume:

Maybe someone has already solved a similar problem and can help me with it?
-
How to avoid sorting of dictionary keys in an Ajax response
I am sending an ajax call to get data from the server side.

    $.ajax({
        dataType: 'json',
        type: "POST",
        data: {
            'data': ["100", "102", "101"],
            "compare_data": 0,
            "category": category_val,
            "type": type
        },
        url: "{% url 'test:test_search' %}",
        beforeSend: function () {
            $("#loader-div").show();
        },
        success: function (data) {
            console.log(data, "data")
        }
    });

From the server side I am returning a JsonResponse. Attaching a sample here:

    response = {"100": "test", "102": "test12", "101": "test13"}
    return JsonResponse({'status': '200', 'response': response})

But when I inspect the response dictionary in JavaScript it comes back as {"100":"test","101":"test13","102":"test12"} - the response is getting sorted. Is there a way to stop the sorting of keys? I tried changing the type of the keys to strings, but the values still come back sorted.
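A hedged explanation: the JSON text Django sends preserves the insertion order, but JavaScript objects enumerate integer-like string keys in ascending numeric order, so the reordering happens in the browser rather than in JsonResponse. One common workaround is to return an ordered structure such as a list of pairs (sketch):

    # views.py - an ordered list survives the round trip unchanged
    response = [["100", "test"], ["102", "test12"], ["101", "test13"]]
    return JsonResponse({'status': '200', 'response': response})

The success callback can then iterate the array, which keeps the original order.
-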
Signup form in Django not rendering CheckboxInput as a checkbox
I want to add an "Accept terms and conditions" checkbox to the signup form in my Django project. As I did before in a different form, I added this:

    class SignUpForm(UserCreationForm):
        email = forms.CharField(max_length=254, required=True, widget=forms.EmailInput())
        accept = forms.BooleanField(required=True, widget=CheckboxInput, label="I Accept the TNC!")  # <- the line I added

        class Meta:
            model = User
            fields = ('username', 'email', 'password1', 'password2')

It shows this: (screenshot) I cannot check it, and I cannot write in it. In a different form I added it the same way and it works normally. It is really weird to me; I am only a beginner with Django. I also tried it with an empty argument list, like this:

    class SignUpForm(UserCreationForm):
        email = forms.CharField(max_length=254, required=True, widget=forms.EmailInput())
        accept = forms.BooleanField()

        class Meta:
            model = User
            fields = ('username', 'email', 'password1', 'password2')

It shows the same thing.
-
How to write an Excel file to users' desktops using a Python program running on AWS
We have an application whose UI is developed in React and runs on users' laptops/desktops, and whose backend is developed in Django and runs on AWS. When we create an Excel output of a dataframe, it gets saved on the EC2 instance instead of the user's laptop, which is quite understandable. But how do we create these output files on the individual users' laptops instead of EC2, given that these users do not have direct access to AWS?
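The usual pattern is for the Django view to send the workbook bytes back in the HTTP response, so the browser (or the React app, via a blob download) saves the file on the user's own machine rather than on EC2. A minimal sketch, assuming pandas plus openpyxl are installed and the real dataframe is already built:

    import io
    import pandas as pd
    from django.http import HttpResponse

    def export_excel(request):
        df = pd.DataFrame({"col": [1, 2, 3]})  # placeholder for the real dataframe
        buffer = io.BytesIO()
        df.to_excel(buffer, index=False)  # write the workbook into memory
        buffer.seek(0)
        response = HttpResponse(
            buffer.getvalue(),
            content_type="application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
        )
        response["Content-Disposition"] = 'attachment; filename="output.xlsx"'
        return response
-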
When running tests: django.db.utils.OperationalError: (1071, 'Specified key was too long; max key length is 3072 bytes')
When trying to test Django with

    python manage.py test

it shows an error like this:

    django.db.utils.OperationalError: (1071, 'Specified key was too long; max key length is 3072 bytes')

The MySQL version is mysql Ver 8.1.0 for macos14.0 on arm64 (Homebrew). It occurs only when testing, and I don't use any special settings for the test database. These are my database settings:

    'default': {
        'ENGINE': 'django.db.backends.mysql',
        "NAME": config("DB_NAME"),
        "USER": config("DB_USER"),
        "PASSWORD": config("DB_PASSWORD"),
        "HOST": config("DB_HOST"),
        "PORT": "3306",
        'OPTIONS': {
            'charset': 'utf8mb4',
        },
    }
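A hedged suggestion, since the limit is only hit while the test database is being created from scratch: the TEST sub-dictionary lets you pin the charset and collation the test database is created with, which sometimes keeps index key sizes under the 3072-byte limit (whether it resolves this particular case depends on which index is too long):

    'default': {
        'ENGINE': 'django.db.backends.mysql',
        # ... existing settings ...
        'TEST': {
            'CHARSET': 'utf8mb4',
            'COLLATION': 'utf8mb4_general_ci',
        },
    }
-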
django-helpdesk 0.4.1 "extendMarkdown() missing 1 required positional argument: 'md_globals'"
When I try to view /tickets/1/ I get the error:

    extendMarkdown() missing 1 required positional argument: 'md_globals'
    /usr/local/lib/python3.9/dist-packages/markdown/core.py, line 115, in registerExtensions

How can I fix this?
-
How to save logical expressions in Django?
I have a model in my Django project like this:

    class Condition(models.Model):
        activity = models.CharField(max_length=50)
        operator = models.CharField(max_length=50)
        condition = models.IntegerField()

Suppose we store the user's activity in another model called UserActivity, and we want to check whether the user has performed that activity more than condition times. But this is not enough: I want to combine these Condition objects with logical operators like AND, OR, XOR, NOT, etc. Can you help me do this? Assume that the person who creates the rules has no engineering knowledge, so I have to think about a comfortable user experience.
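One possible way to model the combinations (a sketch under the assumption that rules form a tree: leaf nodes point at a Condition, inner nodes carry a logical operator; check_condition is a hypothetical helper that evaluates a single Condition against UserActivity):

    from django.db import models

    class RuleNode(models.Model):
        OPERATORS = [("AND", "AND"), ("OR", "OR"), ("XOR", "XOR"), ("NOT", "NOT")]

        parent = models.ForeignKey(
            "self", null=True, blank=True, related_name="children", on_delete=models.CASCADE
        )
        operator = models.CharField(max_length=3, choices=OPERATORS, blank=True)
        condition = models.ForeignKey(
            Condition, null=True, blank=True, on_delete=models.CASCADE
        )

        def evaluate(self, user):
            if self.condition_id:  # leaf node: evaluate the single condition
                return check_condition(self.condition, user)  # hypothetical helper
            results = [child.evaluate(user) for child in self.children.all()]
            if self.operator == "AND":
                return all(results)
            if self.operator == "OR":
                return any(results)
            if self.operator == "XOR":
                return sum(results) == 1
            if self.operator == "NOT":
                return not results[0]
            return False

A non-technical rule author never has to see this structure directly; a form or drag-and-drop builder can create the tree on their behalf.
-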
Django failing to import app in a different directory
A codebase which works on another local machine isn't working on mine, and this has taken far more time than I expected. Command to run the application:

    python3 manage.py runserver --settings=config.settings.local

Directory:

    --- Config
    ----> Settings
    ----> urls.py
    ----> wsgi.py
    ----> asgi.py
    ----> __init__.py
    --- Pybo
    ----> views
    ----> apps.py
    ----> urls.py
    ----> views
    ------> indicator_views.py
    ------> profile_check.py

After running the command, I encounter a 404 error saying "No Board matches the given query". If I try to jump to the definition of the imported module by clicking from pybo.views import indicator_views, profile_check inside config/urls.py, IntelliSense complains "No definition found for views". So I suspect that config/urls.py isn't correctly importing the components located in the other directory. Is there a way I can solve this? It's my first time using Django, and I believe the structure is set up a bit unusually, but I was given this codebase and apparently I should start from what I have. I also tried running it in a venv, but it complained that background-task isn't installed, and once I finished installing that package, it complained that urls.conf is deprecated, so I abandoned the venv.
-
Is it possible to annotate a column which contains a queryset?
I want to show a table in an HTML template with data from the GroupCaseFile model, but one of the columns must contain data from the most recent ProceduralAct object, specifically 'act_number', 'act_type' and 'act_summary' (inside a single cell of the HTML table). I could simply use something like this to get it:

    group_case_file.case_file.proceduralact_set.all().order_by('-created_at').first()

But before getting the last ProceduralAct, I want to apply filtering by a search term. The search term should cover 'act_number', 'act_type' and 'act_summary' of the corresponding last ProceduralAct. To be able to do this search, I have tried using nested subqueries, like I did with 'summary' in the code below, to annotate the 'summary' value onto the GroupCaseFile query and only then apply the filtering by search term. I think this would be enough if I only wanted to search 'summary'. But I want to know whether there is any way to annotate a column, say last_procedural_act, which can contain a QuerySet of the corresponding most recent ProceduralAct row. That way I could perform the search on all of the fields of that ProceduralAct row. I'd also appreciate a different approach for this, if there is one.

Models:

    class CaseFile(models.Model):
        code = models.CharField(primary_key=True, max_length=30)
        # More fields

    class GroupCaseFile(models.Model):
        …
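An annotation cannot hold a whole queryset or model instance, but each field needed from the most recent ProceduralAct can be annotated separately with Subquery/OuterRef and then searched together; a sketch (the FK from ProceduralAct to CaseFile is assumed to be called case_file, and search holds the search term):

    from django.db.models import OuterRef, Q, Subquery

    latest_act = ProceduralAct.objects.filter(
        case_file=OuterRef("case_file")  # FK field name assumed
    ).order_by("-created_at")

    qs = GroupCaseFile.objects.annotate(
        last_act_number=Subquery(latest_act.values("act_number")[:1]),
        last_act_type=Subquery(latest_act.values("act_type")[:1]),
        last_act_summary=Subquery(latest_act.values("act_summary")[:1]),
    ).filter(
        Q(last_act_number__icontains=search)
        | Q(last_act_type__icontains=search)
        | Q(last_act_summary__icontains=search)
    )

The three annotated values can then be combined into the single table cell in the template.
-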
Run Django query and time out after X seconds
I want to build a function which takes a query in MariaDB/MySQL (they use the same engine setting in Django), times out after X seconds, and then executes a faster, heuristic query instead. I found a PostgreSQL example:

    with transaction.atomic(), connection.cursor() as cursor:
        cursor.execute("SET LOCAL statement_timeout TO 50;")
        try:
            return cursor.execute(my_query)
        except OperationalError:
            pass
    with connection.cursor() as cursor:
        cursor.execute(heuristic_query)
        return int(cursor.fetchone()[0])

I want to do the same for MariaDB/MySQL, but I read that (a) you have to set the timeout directly before the query, and (b) the way max_statement_time is implemented differs between MariaDB and MySQL. Any ideas how I would build such a thing? Thanks!
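A hedged sketch of a MariaDB variant (MariaDB's max_statement_time takes seconds and can be set per session; MySQL's counterpart is max_execution_time in milliseconds and only applies to SELECT, so the two would need separate handling):

    from django.db import OperationalError, connection

    def query_with_timeout(my_query, heuristic_query, seconds=0.05):
        try:
            with connection.cursor() as cursor:
                # MariaDB: value is in seconds, fractions allowed
                cursor.execute(f"SET SESSION max_statement_time = {float(seconds)}")
                cursor.execute(my_query)
                return int(cursor.fetchone()[0])
        except OperationalError:
            pass  # the statement was aborted by the timeout
        finally:
            with connection.cursor() as cursor:
                cursor.execute("SET SESSION max_statement_time = 0")  # back to no limit
        with connection.cursor() as cursor:
            cursor.execute(heuristic_query)
            return int(cursor.fetchone()[0])
-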
How to keep the scheduled jobs of a Celery worker in a Docker container? (Django, Celery, Redis, Docker)
I deploy my Django app with Docker. When I update my deployment, I stop the currently running container and start a new container from the new image. I start my Celery worker inside the created container (exec -> celery worker). The problem is that the scheduled jobs (not yet executed) of Celery in the old container are gone, since the worker is shut down when the container it lives in is stopped. I understand why it happens, but I cannot think of how to make the Celery worker persistent, because the worker is dependent on the Django app. I thought the broker keeps the jobs and that a new Celery worker, once connected to the broker again, would receive and execute them. So I kept the Redis container (the broker) intact, without stopping or restarting it. But the new worker did not execute the previously scheduled jobs. (I think I misunderstood how brokers work.) I think there can be two ways: store the jobs somewhere all Celery workers can refer to, or make the Celery worker itself persistent, but I'm not sure whether either is possible. Any solutions are welcome. Thank you!
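A hedged note on why this can happen, plus a configuration sketch (app is assumed to be the project's Celery instance): tasks scheduled with eta/countdown are prefetched into the worker's memory and, with default settings, acknowledged before they run, so they disappear with the container. Acknowledging late keeps them on the broker until they actually finish, letting a freshly started worker pick them up; with Redis, the visibility timeout bounds how long a reserved but unacknowledged task waits before being redelivered:

    # celery.py / Celery configuration
    app.conf.task_acks_late = True            # ack only after the task has run
    app.conf.worker_prefetch_multiplier = 1   # don't hoard messages in worker memory
    app.conf.broker_transport_options = {
        # should exceed the longest countdown/eta you schedule, in seconds
        "visibility_timeout": 60 * 60 * 24,
    }

Shutting the old worker down gracefully (warm shutdown via SIGTERM) before removing the container also lets it hand back any tasks it has reserved but not yet acknowledged.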