Lanka Developers Community


    Group Details

    Linux-Help

    Member List

    dinlinux
    ciaompe
    Ayesh Dulanja
    Xenon
    ch_scripter
    oshan.wisumperuma
    yasiru thimiuru
    ishan m herath
    Pasindu Vishmika
    heshan.feedback
    harshasanmix
    binurayeshan
    MonsteR_X
    MrCentimetre
    Piumal Kavinda
    Sajix_sique
    Rusiru Athukorala
    imadusanka
    Kenthiran
    Thushal Kulatilake
    • RE: The third-party login system is not working on LankaDevelopers.lk.

      Apologies for the late reply. We have now resolved all the login issues with the third-party login system. If you have any further questions or need additional assistance, please let us know.

      posted in Comments & Feedback
      ciaompe
    • RE: The third-party login system is not working on LankaDevelopers.lk.

      Oh, thanks for reporting. We'll solve it ASAP.

      posted in Comments & Feedback
      root
    • RE: TALL STACK - Application optimize

      What do you mean by optimize?

      posted in Laravel Framework
      root
    • RE: Mora UXplore 1.0 - University of Moratuwa IEEE Student Branch

      Glad to hear that, thanks for sharing.

      posted in Events ( Free & Paid )
      root
    • Mora UXplore 1.0 - University of Moratuwa IEEE Student Branch

      [image attachment: WhatsApp Image 2023-04-12 at 11.06.15.jpg]

      " Even the greatest was once a beginner. Don't be afraid to take that first step "
      - Muhammad Ali

      Mora UXplore 1.0 is a UI/UX designing competition for undergraduates from any university. It aims to offer a platform for them to explore, learn, and enhance their skills in this field. 🀩✨

      πŸ”΄ REGISTRATION IS NOW OPEN πŸ”΄

      Click the link below and take the first step to be the bestπŸ‘‡
      https://morauxplore.lk

      For more info join with our Facebook Event:
      https://fb.me/e/2AyV3rg8Q

      Caption by- D.M.M. Shehan
      Design by- Dilina Raveen

      -Inspired by PASSION to Transform beyond EXCELLENCE-

      #MoraUXplore1.0
      #IEEESBUOM
      #TERM22/23

      posted in Events ( Free & Paid )
      MrCentimetre
    • RE: Join the Festivities! Win a Free T-Shirt by Becoming a Part of Lanka Developers Community this New Year!

      Done!

      #NewYearGiveaway
      #WinWithLankaDevelopers
      #JoinOurCommunity
      #FreeTShirts
      #ContributeToWin

      posted in Announcements
      MrCentimetre
    • RE: Lanka Developers Community T-shirts

      @kupeshanth

      You can place the order here.

      posted in Announcements
      root
    • LocalAI: A drop-in replacement for OpenAI

      LocalAI

      LLaMA, alpaca, gpt4all, vicuna, koala, gpt4all-j


      A self-hosted, community-driven, simple local OpenAI-compatible API written in Go. It can be used as a drop-in replacement for OpenAI, running on CPU with consumer-grade hardware. Supports ggml-compatible models: LLaMA, alpaca, gpt4all, vicuna, koala, gpt4all-j.


      Using LocalAI is straightforward. You can simply install LocalAI on your local machine or server via Docker and start performing inference tasks immediately. No more talking, let's start:

      1. Install Docker on your PC or server (installation depends on the OS type; check here).

      2. Open a terminal or cmd and clone the LocalAI repo from GitHub:

          git clone https://github.com/go-skynet/LocalAI
        
      3. Go to the LocalAI/models folder in the terminal:

          cd LocalAI/models
        
      4. Download the model (here I use the gpt4all-j model; it comes with an Apache 2.0 license, so it can be used for commercial purposes):

         wget https://gpt4all.io/models/ggml-gpt4all-j.bin
        

        Here I use wget to download; you can also download the bin file manually and copy it into the LocalAI/models folder.

      5. Go back to the LocalAI root folder.

      6. Start it with Docker Compose:

           docker compose up -d --build
        

      After the above process finishes, let's call LocalAI via the terminal or cmd. Here I use curl, but you can use any other tool that can perform HTTP requests (Postman, etc.):

      curl http://localhost:8080/v1/models
      

      This request shows which models we have added to the models directory.
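The same listing can be done from code. Since the post says LocalAI follows the OpenAI reference, the response from /v1/models should be an OpenAI-style list object; the following is only a sketch under that assumption (the helper names and BASE_URL are mine), using just the Python standard library:

```python
import json
import urllib.request

# Assumed address of the LocalAI container started above
BASE_URL = "http://localhost:8080/v1"

def parse_model_ids(payload: dict) -> list:
    """Extract model ids from an OpenAI-style model list response,
    e.g. {"object": "list", "data": [{"id": "ggml-gpt4all-j", ...}]}."""
    return [m["id"] for m in payload.get("data", [])]

def list_models() -> list:
    """GET /v1/models and return the available model ids."""
    with urllib.request.urlopen(f"{BASE_URL}/models") as resp:
        return parse_model_ids(json.load(resp))

if __name__ == "__main__":
    # Requires a running LocalAI container
    print(list_models())
```

This mirrors the `curl http://localhost:8080/v1/models` call above; only the response parsing is new.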

      Let's call the AI with an actual prompt.

      curl http://localhost:8080/v1/completions -H "Content-Type: application/json" -d '{
           "model": "ggml-gpt4all-j",            
           "prompt": "Explain AI to me like a five-year-old",
           "temperature": 0.7
         }'
      

      Windows compatibility

      It should work; however, you need to make sure you give the container enough resources.

      Kubernetes
      You can run the API in Kubernetes; see the example deployment in the kubernetes folder.


      API Support
      LocalAI provides an API for running text generation as a service that follows the OpenAI reference and can be used as a drop-in replacement. Once a model is loaded for the first time, it is kept in memory.

      Example of starting the API with docker:

      docker run -p 8080:8080 -ti --rm quay.io/go-skynet/local-api:latest --models-path /path/to/models --context-size 700 --threads 4
      

      Then you'll see:

      β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β” 
      β”‚                   Fiber v2.42.0                   β”‚ 
      β”‚               http://127.0.0.1:8080               β”‚ 
      β”‚       (bound on host 0.0.0.0 and port 8080)       β”‚ 
      β”‚                                                   β”‚ 
      β”‚ Handlers ............. 1  Processes ........... 1 β”‚ 
      β”‚ Prefork ....... Disabled  PID ................. 1 β”‚ 
      β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜ 
      

      If you want more info about the API, go to the GitHub page:
      https://github.com/go-skynet/LocalAI#api

      posted in Artificial Intelligence
      root
    • RE: Does anyone work with the Cardcom payment gateway?

      @rohandhananjaya

      You're welcome.

      Do some reverse engineering on laravel-cardcom. I think you can convert it to a plain PHP package by removing the Laravel facade. If you want, I'll help you.

      posted in Web Development
      lkdev
    • RE: Does anyone work with the Cardcom payment gateway?

      Also check this:

      https://github.com/yadahan/laravel-cardcom

      posted in Web Development
      lkdev