Shashwat13333 committed
Commit f027265 · verified · Parent: dce2dcd

Model save
1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
{
  "word_embedding_dimension": 768,
  "pooling_mode_cls_token": true,
  "pooling_mode_mean_tokens": false,
  "pooling_mode_max_tokens": false,
  "pooling_mode_mean_sqrt_len_tokens": false,
  "pooling_mode_weightedmean_tokens": false,
  "pooling_mode_lasttoken": false,
  "include_prompt": true
}
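For reference, `pooling_mode_cls_token: true` means the sentence embedding is simply the hidden state of the first (`[CLS]`) token. A minimal NumPy sketch of that operation (illustrative tensors, not the actual sentence-transformers `Pooling` module):

```python
import numpy as np

def cls_pool(token_embeddings: np.ndarray) -> np.ndarray:
    """CLS pooling: take the first token's vector as the sentence embedding."""
    # token_embeddings: (batch, seq_len, word_embedding_dimension)
    return token_embeddings[:, 0, :]

batch = np.zeros((2, 16, 768))
batch[:, 0, :] = 1.0  # pretend the [CLS] position carries the signal
pooled = cls_pool(batch)
print(pooled.shape)  # (2, 768)
```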
README.md ADDED
@@ -0,0 +1,895 @@
---
language:
- en
license: apache-2.0
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:150
- loss:MatryoshkaLoss
- loss:MultipleNegativesRankingLoss
base_model: BAAI/bge-base-en-v1.5
widget:
- source_sentence: What makes you different from other AI consultancies?
  sentences:
  - 'We are a New breed of innovative digital transformation agency, redefining storytelling
    for an always-on world.

    With roots dating back to 2017, we started as a pocket size team of enthusiasts
    with a goal of helping traditional businesses transform and create dynamic, digital
    cultures through disruptive strategies and agile deployment of innovative solutions.'
  - " We specialize in guiding companies through the complexities of adopting and\
    \ integrating Artificial Intelligence and Machine Learning technologies. Our consultancy\
    \ services are designed to enhance your operational efficiency and decision-making\
    \ capabilities across all sectors. With a global network of AI/ML experts and\
    \ a commitment to excellence, we are your partners in transforming innovative\
    \ possibilities into real-world achievements. \
    \ \
    \ \n DATA INTELLIGENCE PLATFORMS we specialize\
    \ in\nTensorFlow\nDatabricks\nTableau\nPytorch\nOpenAI\nPinecone\""
  - 'How can we get started with your DevOps solutions?

    Getting started is easy. Contact us through our website. We''ll schedule a consultation
    to discuss your needs, evaluate your current infrastructure, and propose a customized
    DevOps solution designed to achieve your goals.'
- source_sentence: Do you provide support 24/7?
  sentences:
  - 'How do we do Custom Development ?

    We follow below process to develop custom web or mobile Application on Agile Methodology,
    breaking requirements in pieces and developing and shipping them with considering
    utmost quality:

    Requirements Analysis

    We begin by understanding the clients needs and objectives for the website. Identify
    key features, functionality, and any specific design preferences.


    Project Planning

    Then create a detailed project plan outlining the scope, timeline, and milestones.
    Define the technology stack and development tools suitable for the project.


    User Experience Design

    Then comes the stage of Developing wireframes or prototypes to visualize the website''s
    structure and layout. We create a custom design that aligns with the brand identity
    and user experience goals.


    Development

    After getting Sign-off on Design from Client, we break the requirements into Sprints
    on Agile Methodology, and start developing them.'
  - 'This is our Portfolio

    Introducing the world of Housing Finance& Banking Firm.

    Corporate Website with 10 regional languages in India with analytics and user
    personalization and Dashboard for Regional Managers, Sales Agents, etc. to manage
    the Builder Requests, approve/deny Properties, manage visits and appointments,
    manage leads, etc.



    Introducing the world of Global Automotive Brand.We have implemented a Multi Locale
    Multilingual Omnichannel platform for Royal Enfield. The platform supports public
    websites, customer portals, internal portals, business applications for over 35+
    different locations all over the world.


    Developed Digital Platform for Students, Guardians, Teachers, Tutors, with AI/ML
    in collaboration with Successive Technologies Inc, USA. Cloud, Dev-Sec-Ops &
    Data Governance

    Managing cloud provisioning and modernization alongside automated infrastructure,
    event-driven microservices, containerization, DevOps, cybersecurity, and 24x7
    monitoring support ensures efficient, secure, and responsive IT operations.'
  - "SERVICES WE PROVIDE\nFlexible engagement models tailored to your needs\nWe specialize\
    \ in comprehensive website audits that provide valuable insights and recommendations\
    \ to enhance your online presence.\nDigital Strategy & Consulting\nCreating digital\
    \ roadmap that transform your digital enterprise and produce a return on investment,\
    \ basis our discovery framework, brainstorming sessions & current state analysis.\n\
    \nPlatform Selection\nHelping you select the optimal digital experience, commerce,\
    \ cloud and marketing platform for your enterprise.\n\nPlatform Builds\nDeploying\
    \ next-gen scalable and agile enterprise digital platforms, along with multi-platform\
    \ integrations. \nProduct Builds\nHelp you ideate, strategize, and engineer\
    \ your product with help of our enterprise frameworks\nInfrastructure\nSpecialize\
    \ in multi-cloud infrastructure helping you put forward the right cloud infrastructure\
    \ and optimization strategy.\n\nManaged Services\nOperate and monitor your business-critical\
    \ applications, data, and IT workloads, along with Application maintenance and\
    \ operations.\nTeam Augmentation\nHelp you scale up and augment your existing\
    \ team to solve your hiring challenges with our easy to deploy staff augmentation\
    \ offerings.\""
- source_sentence: What challenges did the company face in its early days?
  sentences:
  - 'Why do we need Microservices ?

    Instead of building a monolithic application where all functionalities are tightly
    integrated, microservices break down the system into modular and loosely coupled
    services.


    Scalability

    Flexibility and Agility

    Resilience and Fault Isolation

    Technology Diversity

    Continuous Delivery'
  - 'After a transformative scuba dive in the Maldives, Mayank Maggon made a pivotal
    decision to depart from the corporate ladder in December 2016. Fueled by a clear
    vision to revolutionize the digital landscape, Mayank set out to leverage the
    best technology ingredients, crafting custom applications and digital ecosystems
    tailored to clients'' specific needs, limitations, and budgets.


    However, this solo journey was not without its challenges. Mayank had to initiate
    the revenue engine by offering corporate trainings and conducting online batches
    for tech training across the USA. He also undertook small projects and subcontracted
    modules of larger projects for clients in the US, UK, and India. It was only after
    this initial groundwork that Mayank was able to hire a group of interns, whom
    he meticulously trained and groomed to prepare them for handling Enterprise Level
    Applications. This journey reflects Mayank''s resilience, determination, and entrepreneurial
    spirit in building TechChefz Digital from the ground up.


    With a passion for innovation and a relentless drive for excellence, Mayank has
    steered TechChefz Digital through strategic partnerships, groundbreaking projects,
    and exponential growth. His leadership has been instrumental in shaping the company
    into a leading force in the digital transformation arena, inspiring a culture
    of innovation and excellence that continues to propel the company forward.'
  - 'What makes your DevOps solutions stand out from the competition?

    Our DevOps solutions stand out due to our personalized approach, extensive expertise,
    and commitment to innovation. We focus on delivering measurable results, such
    as reduced deployment times, improved system reliability, and enhanced security,
    ensuring you get the maximum benefit from our services.'
- source_sentence: What kind of data do you leverage for AI solutions?
  sentences:
  - 'Our Solutions

    Strategy & Digital Transformation

    Innovate via digital transformation, modernize tech, craft product strategies,
    enhance customer experiences, optimize data analytics, transition to cloud for
    growth and efficiency


    Product Engineering & Custom Development

    Providing product development, enterprise web and mobile development, microservices
    integrations, quality engineering, and application support services to drive innovation
    and enhance operational efficiency.'
  - 'In what ways can machine learning optimize our operations?

    Machine learning algorithms can analyze operational data to identify inefficiencies,
    predict maintenance needs, optimize supply chains, and automate repetitive tasks,
    significantly improving operational efficiency and reducing costs.'
  - Our AI/ML services pave the way for transformative change across industries, embodying
    a client-focused approach that integrates seamlessly with human-centric innovation.
    Our collaborative teams are dedicated to fostering growth, leveraging data, and
    harnessing the predictive power of artificial intelligence to forge the next wave
    of software excellence. We don't just deliver AI; we deliver the future.
- source_sentence: What do you guys do for digital strategy?
  sentences:
  - " What we do\n\nDigital Strategy\nCreating digital frameworks that transform\
    \ your digital enterprise and produce a return on investment.\n\nPlatform Selection\n\
    Helping you select the optimal digital experience, commerce, cloud and marketing\
    \ platform for your enterprise.\n\nPlatform Builds\nDeploying next-gen scalable\
    \ and agile enterprise digital platforms, along with multi-platform integrations.\n\
    \nProduct Builds\nHelp you ideate, strategize, and engineer your product with\
    \ help of our enterprise frameworks \n\nTeam Augmentation\nHelp you scale up and\
    \ augment your existing team to solve your hiring challenges with our easy to\
    \ deploy staff augmentation offerings .\nManaged Services\nOperate and monitor\
    \ your business-critical applications, data, and IT workloads, along with Application\
    \ maintenance and operations\n"
  - 'Introducing the world of General Insurance Firm

    In this project, we implemented Digital Solution and Implementation with Headless
    Drupal as the CMS, and lightweight React JS (Next JS SSR on Node JS) with the
    following features:

    PWA & AMP based Web Pages

    Page Speed Optimization

    Reusable and scalable React JS / Next JS Templates and Components

    Headless Drupal CMS with Content & Experience management, approval workflows,
    etc for seamless collaboration between the business and marketing teams

    Minimalistic Buy and Renewal Journeys for various products, with API integrations
    and adherence to data compliances


    We achieved 250% Reduction in Operational Time and Effort in managing the Content
    & Experience for Buy & renew Journeys,220% Reduction in Customer Drops during
    buy and renewal journeys, 300% Reduction in bounce rate on policy landing and
    campaign pages'
  - 'In the Introducing the world of Global Insurance Firm, we crafted Effective Solutions
    for Complex Problems and delieverd a comprehensive Website Development, Production
    Support & Managed Services, we optimized customer journeys, integrate analytics,
    CRM, ERP, and third-party applications, and implement cutting-edge technologies
    for enhanced performance and efficiency

    and achievied 200% Reduction in operational time & effort managing content & experience,
    70% Reduction in Deployment Errors and Downtime, 2.5X Customer Engagement, Conversion
    & Retention'
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
model-index:
- name: BGE base Financial Matryoshka
  results:
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 768
      type: dim_768
    metrics:
    - type: cosine_accuracy@1
      value: 0.26666666666666666
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.5466666666666666
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.6533333333333333
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.7333333333333333
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.26666666666666666
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.1822222222222222
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.13066666666666663
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.0733333333333333
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.26666666666666666
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.5466666666666666
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.6533333333333333
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.7333333333333333
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.4946720627416652
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.41827513227513213
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.4332969863845117
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 512
      type: dim_512
    metrics:
    - type: cosine_accuracy@1
      value: 0.18666666666666668
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.49333333333333335
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.5733333333333334
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.72
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.18666666666666668
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.16444444444444442
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.11466666666666664
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.07199999999999998
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.18666666666666668
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.49333333333333335
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.5733333333333334
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.72
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.4372247441588665
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.34789947089947076
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.3616024548893477
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 256
      type: dim_256
    metrics:
    - type: cosine_accuracy@1
      value: 0.16
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.48
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.56
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.7333333333333333
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.16
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.16
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.11199999999999997
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.07333333333333332
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.16
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.48
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.56
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.7333333333333333
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.4259211679661321
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.3293015873015872
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.3409713598723333
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 128
      type: dim_128
    metrics:
    - type: cosine_accuracy@1
      value: 0.14666666666666667
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.44
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.52
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.6933333333333334
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.14666666666666667
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.14666666666666667
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.104
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.06933333333333333
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.14666666666666667
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.44
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.52
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.6933333333333334
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.4080285837157558
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.3185132275132275
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.33097331081482184
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 64
      type: dim_64
    metrics:
    - type: cosine_accuracy@1
      value: 0.09333333333333334
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.37333333333333335
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.4533333333333333
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.6266666666666667
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.09333333333333334
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.12444444444444443
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.09066666666666667
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.06266666666666666
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.09333333333333334
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.37333333333333335
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.4533333333333333
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.6266666666666667
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.33947261913722904
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.24945502645502649
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.2624040455744732
      name: Cosine Map@100
---

# BGE base Financial Matryoshka

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) <!-- at revision a5beb1e3e68b9ab74eb54cfd186867f64f240e1a -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
- **Language:** en
- **License:** apache-2.0

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("Shashwat13333/bge-base-en-v1.5_v2")
# Run inference
sentences = [
    'What do you guys do for digital strategy?',
    ' What we do\n\nDigital Strategy\nCreating digital frameworks that transform your digital enterprise and produce a return on investment.\n\nPlatform Selection\nHelping you select the optimal digital experience, commerce, cloud and marketing platform for your enterprise.\n\nPlatform Builds\nDeploying next-gen scalable and agile enterprise digital platforms, along with multi-platform integrations.\n\nProduct Builds\nHelp you ideate, strategize, and engineer your product with help of our enterprise frameworks \n\nTeam Augmentation\nHelp you scale up and augment your existing team to solve your hiring challenges with our easy to deploy staff augmentation offerings .\nManaged Services\nOperate and monitor your business-critical applications, data, and IT workloads, along with Application maintenance and operations\n',
    'In the Introducing the world of Global Insurance Firm, we crafted Effective Solutions for Complex Problems and delieverd a comprehensive Website Development, Production Support & Managed Services, we optimized customer journeys, integrate analytics, CRM, ERP, and third-party applications, and implement cutting-edge technologies for enhanced performance and efficiency\nand achievied 200% Reduction in operational time & effort managing content & experience, 70% Reduction in Deployment Errors and Downtime, 2.5X Customer Engagement, Conversion & Retention',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

## Evaluation

### Metrics

#### Information Retrieval

* Datasets: `dim_768`, `dim_512`, `dim_256`, `dim_128` and `dim_64`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)

| Metric              | dim_768    | dim_512    | dim_256    | dim_128   | dim_64     |
|:--------------------|:-----------|:-----------|:-----------|:----------|:-----------|
| cosine_accuracy@1   | 0.2667     | 0.1867     | 0.16       | 0.1467    | 0.0933     |
| cosine_accuracy@3   | 0.5467     | 0.4933     | 0.48       | 0.44      | 0.3733     |
| cosine_accuracy@5   | 0.6533     | 0.5733     | 0.56       | 0.52      | 0.4533     |
| cosine_accuracy@10  | 0.7333     | 0.72       | 0.7333     | 0.6933    | 0.6267     |
| cosine_precision@1  | 0.2667     | 0.1867     | 0.16       | 0.1467    | 0.0933     |
| cosine_precision@3  | 0.1822     | 0.1644     | 0.16       | 0.1467    | 0.1244     |
| cosine_precision@5  | 0.1307     | 0.1147     | 0.112      | 0.104     | 0.0907     |
| cosine_precision@10 | 0.0733     | 0.072      | 0.0733     | 0.0693    | 0.0627     |
| cosine_recall@1     | 0.2667     | 0.1867     | 0.16       | 0.1467    | 0.0933     |
| cosine_recall@3     | 0.5467     | 0.4933     | 0.48       | 0.44      | 0.3733     |
| cosine_recall@5     | 0.6533     | 0.5733     | 0.56       | 0.52      | 0.4533     |
| cosine_recall@10    | 0.7333     | 0.72       | 0.7333     | 0.6933    | 0.6267     |
| **cosine_ndcg@10**  | **0.4947** | **0.4372** | **0.4259** | **0.408** | **0.3395** |
| cosine_mrr@10       | 0.4183     | 0.3479     | 0.3293     | 0.3185    | 0.2495     |
| cosine_map@100      | 0.4333     | 0.3616     | 0.341      | 0.331     | 0.2624     |

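Because the model was trained with MatryoshkaLoss, an embedding can be truncated to a prefix of its 768 dimensions (for example 64) and re-normalized before computing cosine similarity, trading retrieval quality for speed and storage. A minimal NumPy sketch of that truncation with random stand-in vectors (not real model output):

```python
import numpy as np

def truncate_and_normalize(embeddings: np.ndarray, dim: int) -> np.ndarray:
    """Keep the first `dim` components of each embedding, then L2-normalize rows."""
    cut = embeddings[:, :dim]
    return cut / np.linalg.norm(cut, axis=1, keepdims=True)

rng = np.random.default_rng(0)
full = rng.normal(size=(3, 768))  # stand-in for model.encode(...) output

small = truncate_and_normalize(full, 64)
print(small.shape)               # (3, 64)
similarities = small @ small.T   # cosine similarity, since rows are unit-norm
print(np.allclose(np.diag(similarities), 1.0))  # True
```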
623
+ <!--
624
+ ## Bias, Risks and Limitations
625
+
626
+ *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
627
+ -->
628
+
629
+ <!--
630
+ ### Recommendations
631
+
632
+ *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
633
+ -->
634
+
635
+ ## Training Details
636
+
637
+ ### Training Dataset
638
+
639
+ #### Unnamed Dataset
640
+
641
+ * Size: 150 training samples
642
+ * Columns: <code>anchor</code> and <code>positive</code>
643
+ * Approximate statistics based on the first 150 samples:
644
+ | | anchor | positive |
645
+ |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
646
+ | type | string | string |
647
+ | details | <ul><li>min: 7 tokens</li><li>mean: 11.95 tokens</li><li>max: 20 tokens</li></ul> | <ul><li>min: 20 tokens</li><li>mean: 125.49 tokens</li><li>max: 378 tokens</li></ul> |
648
+ * Samples:
649
+ | anchor | positive |
650
+ |:--------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
651
+ | <code>Is it hard to move old systems to the cloud?</code> | <code>We offer custom software development, digital marketing strategies, and tailored solutions to drive tangible results for your business. Our expert team combines technical prowess with industry insights to propel your business forward in the digital landscape.<br><br>"Engage, analyze & target your customers<br>Digital transformation enables you to interact with customers across multiple channels, providing personalized experiences. This could include social media engagement, interactive websites, and mobile apps." "Empower your employees & partners<br>The push for digital transformation has led many companies to embrace cloud solutions. However, the migration and integration of legacy systems into the cloud often present challenges." "Optimize & automate your operations<br>The push for digital transformation has led many companies to embrace cloud solutions. However, the migration and integration of legacy systems into the cloud often present challenges." "Transform your products<br>The push for digi...</code> |
652
+ | <code>What benefits does marketing automation offer for time management?</code> | <code>Our MarTech capabilities<br><br>Personalization<br>Involves tailoring marketing messages and experiences to individual customers. It enhances customer engagement, loyalty, and ultimately, conversion rates.<br><br>Marketing Automation<br>Marketing automation streamlines repetitive tasks such as email marketing, lead nurturing, and social media posting. It improves efficiency, saves time, and ensures timely communication with customers.<br><br>Customer Relationship Management<br>CRM systems help manage interactions with current and potential customers. They store customer data, track interactions, and facilitate communication, improving customer retention.</code> |
+ | <code>How can your recommendation engines improve our business?</code> | <code>How can your recommendation engines improve our business?<br>Our recommendation engines are designed to analyze customer behavior and preferences to deliver personalized suggestions, enhancing user experience, increasing sales, and boosting customer retention.</code> |
+ * Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
+ ```json
+ {
+ "loss": "MultipleNegativesRankingLoss",
+ "matryoshka_dims": [
+ 768,
+ 512,
+ 256,
+ 128,
+ 64
+ ],
+ "matryoshka_weights": [
+ 1,
+ 1,
+ 1,
+ 1,
+ 1
+ ],
+ "n_dims_per_step": -1
+ }
+ ```
+
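The MatryoshkaLoss configuration above trains the model so that truncated prefixes of the full 768-dimensional embedding (512, 256, 128 and 64 dims, all weighted equally) remain usable on their own. As a rough sketch of what a downstream consumer does with such embeddings, the toy helper below (illustrative only, not part of the sentence-transformers API) truncates a vector and re-normalizes it to unit length:

```python
import math

def truncate_and_normalize(embedding, dim):
    """Keep the first `dim` components of a Matryoshka embedding,
    then rescale to unit L2 norm so cosine similarity still works."""
    head = embedding[:dim]
    norm = math.sqrt(sum(x * x for x in head))
    return [x / norm for x in head]

# Toy 8-dim stand-in for the real 768-dim output vector.
full = [0.5, 0.5, 0.5, 0.5, 0.0, 0.0, 0.0, 0.0]
small = truncate_and_normalize(full, 2)
# The truncated vector is unit length again:
print(sum(x * x for x in small))  # ~1.0
```

Because the loss optimizes every listed prefix length during training, retrieval with the shorter vectors degrades gracefully rather than collapsing.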
+ ### Training Hyperparameters
+ #### Non-Default Hyperparameters
+
+ - `eval_strategy`: epoch
+ - `gradient_accumulation_steps`: 4
+ - `learning_rate`: 1e-05
+ - `weight_decay`: 0.01
+ - `num_train_epochs`: 4
+ - `lr_scheduler_type`: cosine
+ - `warmup_ratio`: 0.1
+ - `fp16`: True
+ - `load_best_model_at_end`: True
+ - `optim`: adamw_torch_fused
+ - `push_to_hub`: True
+ - `hub_model_id`: Shashwat13333/bge-base-en-v1.5_v2
+ - `push_to_hub_model_id`: bge-base-en-v1.5_v2
+ - `batch_sampler`: no_duplicates
+
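Given `gradient_accumulation_steps: 4` above and `per_device_train_batch_size: 8` (listed under All Hyperparameters), the optimizer sees an effective batch of 32 samples per update, assuming a single training device (an assumption; the card does not state the device count):

```python
per_device_train_batch_size = 8   # from the hyperparameters in this card
gradient_accumulation_steps = 4
num_devices = 1                   # assumption: single-GPU training run

# Gradients accumulate over 4 micro-batches before each optimizer step.
effective_batch_size = (per_device_train_batch_size
                        * gradient_accumulation_steps
                        * num_devices)
print(effective_batch_size)  # 32
```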
+ #### All Hyperparameters
+ <details><summary>Click to expand</summary>
+
+ - `overwrite_output_dir`: False
+ - `do_predict`: False
+ - `eval_strategy`: epoch
+ - `prediction_loss_only`: True
+ - `per_device_train_batch_size`: 8
+ - `per_device_eval_batch_size`: 8
+ - `per_gpu_train_batch_size`: None
+ - `per_gpu_eval_batch_size`: None
+ - `gradient_accumulation_steps`: 4
+ - `eval_accumulation_steps`: None
+ - `torch_empty_cache_steps`: None
+ - `learning_rate`: 1e-05
+ - `weight_decay`: 0.01
+ - `adam_beta1`: 0.9
+ - `adam_beta2`: 0.999
+ - `adam_epsilon`: 1e-08
+ - `max_grad_norm`: 1.0
+ - `num_train_epochs`: 4
+ - `max_steps`: -1
+ - `lr_scheduler_type`: cosine
+ - `lr_scheduler_kwargs`: {}
+ - `warmup_ratio`: 0.1
+ - `warmup_steps`: 0
+ - `log_level`: passive
+ - `log_level_replica`: warning
+ - `log_on_each_node`: True
+ - `logging_nan_inf_filter`: True
+ - `save_safetensors`: True
+ - `save_on_each_node`: False
+ - `save_only_model`: False
+ - `restore_callback_states_from_checkpoint`: False
+ - `no_cuda`: False
+ - `use_cpu`: False
+ - `use_mps_device`: False
+ - `seed`: 42
+ - `data_seed`: None
+ - `jit_mode_eval`: False
+ - `use_ipex`: False
+ - `bf16`: False
+ - `fp16`: True
+ - `fp16_opt_level`: O1
+ - `half_precision_backend`: auto
+ - `bf16_full_eval`: False
+ - `fp16_full_eval`: False
+ - `tf32`: None
+ - `local_rank`: 0
+ - `ddp_backend`: None
+ - `tpu_num_cores`: None
+ - `tpu_metrics_debug`: False
+ - `debug`: []
+ - `dataloader_drop_last`: False
+ - `dataloader_num_workers`: 0
+ - `dataloader_prefetch_factor`: None
+ - `past_index`: -1
+ - `disable_tqdm`: False
+ - `remove_unused_columns`: True
+ - `label_names`: None
+ - `load_best_model_at_end`: True
+ - `ignore_data_skip`: False
+ - `fsdp`: []
+ - `fsdp_min_num_params`: 0
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
+ - `fsdp_transformer_layer_cls_to_wrap`: None
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
+ - `deepspeed`: None
+ - `label_smoothing_factor`: 0.0
+ - `optim`: adamw_torch_fused
+ - `optim_args`: None
+ - `adafactor`: False
+ - `group_by_length`: False
+ - `length_column_name`: length
+ - `ddp_find_unused_parameters`: None
+ - `ddp_bucket_cap_mb`: None
+ - `ddp_broadcast_buffers`: False
+ - `dataloader_pin_memory`: True
+ - `dataloader_persistent_workers`: False
+ - `skip_memory_metrics`: True
+ - `use_legacy_prediction_loop`: False
+ - `push_to_hub`: True
+ - `resume_from_checkpoint`: None
+ - `hub_model_id`: Shashwat13333/bge-base-en-v1.5_v2
+ - `hub_strategy`: every_save
+ - `hub_private_repo`: None
+ - `hub_always_push`: False
+ - `gradient_checkpointing`: False
+ - `gradient_checkpointing_kwargs`: None
+ - `include_inputs_for_metrics`: False
+ - `include_for_metrics`: []
+ - `eval_do_concat_batches`: True
+ - `fp16_backend`: auto
+ - `push_to_hub_model_id`: bge-base-en-v1.5_v2
+ - `push_to_hub_organization`: None
+ - `mp_parameters`:
+ - `auto_find_batch_size`: False
+ - `full_determinism`: False
+ - `torchdynamo`: None
+ - `ray_scope`: last
+ - `ddp_timeout`: 1800
+ - `torch_compile`: False
+ - `torch_compile_backend`: None
+ - `torch_compile_mode`: None
+ - `dispatch_batches`: None
+ - `split_batches`: None
+ - `include_tokens_per_second`: False
+ - `include_num_input_tokens_seen`: False
+ - `neftune_noise_alpha`: None
+ - `optim_target_modules`: None
+ - `batch_eval_metrics`: False
+ - `eval_on_start`: False
+ - `use_liger_kernel`: False
+ - `eval_use_gather_object`: False
+ - `average_tokens_across_devices`: False
+ - `prompts`: None
+ - `batch_sampler`: no_duplicates
+ - `multi_dataset_batch_sampler`: proportional
+
+ </details>
+
+ ### Training Logs
+ | Epoch | Step | Training Loss | dim_768_cosine_ndcg@10 | dim_512_cosine_ndcg@10 | dim_256_cosine_ndcg@10 | dim_128_cosine_ndcg@10 | dim_64_cosine_ndcg@10 |
+ |:----------:|:------:|:-------------:|:----------------------:|:----------------------:|:----------------------:|:----------------------:|:---------------------:|
+ | 0.2105 | 1 | 23.6527 | - | - | - | - | - |
+ | 0.8421 | 4 | - | 0.4448 | 0.4228 | 0.4048 | 0.3530 | 0.3089 |
+ | 1.2105 | 5 | 18.7138 | - | - | - | - | - |
+ | 1.8421 | 8 | - | 0.4823 | 0.4710 | 0.4128 | 0.3744 | 0.3462 |
+ | 2.4211 | 10 | 15.363 | - | - | - | - | - |
+ | **2.8421** | **12** | **-** | **0.4836** | **0.4258** | **0.4144** | **0.4182** | **0.3319** |
+ | 3.6316 | 15 | 14.0898 | - | - | - | - | - |
+ | 3.8421 | 16 | - | 0.4947 | 0.4372 | 0.4259 | 0.4080 | 0.3395 |
+
+ * The bold row denotes the saved checkpoint.
+
+ ### Framework Versions
+ - Python: 3.11.11
+ - Sentence Transformers: 3.4.1
+ - Transformers: 4.48.2
+ - PyTorch: 2.5.1+cu124
+ - Accelerate: 1.3.0
+ - Datasets: 3.2.0
+ - Tokenizers: 0.21.0
+
+ ## Citation
+
+ ### BibTeX
+
+ #### Sentence Transformers
+ ```bibtex
+ @inproceedings{reimers-2019-sentence-bert,
+ title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
+ author = "Reimers, Nils and Gurevych, Iryna",
+ booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
+ month = "11",
+ year = "2019",
+ publisher = "Association for Computational Linguistics",
+ url = "https://arxiv.org/abs/1908.10084",
+ }
+ ```
+
+ #### MatryoshkaLoss
+ ```bibtex
+ @misc{kusupati2024matryoshka,
+ title={Matryoshka Representation Learning},
+ author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
+ year={2024},
+ eprint={2205.13147},
+ archivePrefix={arXiv},
+ primaryClass={cs.LG}
+ }
+ ```
+
+ #### MultipleNegativesRankingLoss
+ ```bibtex
+ @misc{henderson2017efficient,
+ title={Efficient Natural Language Response Suggestion for Smart Reply},
+ author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
+ year={2017},
+ eprint={1705.00652},
+ archivePrefix={arXiv},
+ primaryClass={cs.CL}
+ }
+ ```
+
+ <!--
+ ## Glossary
+
+ *Clearly define terms in order to be accessible across audiences.*
+ -->
+
+ <!--
+ ## Model Card Authors
+
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
+ -->
+
+ <!--
+ ## Model Card Contact
+
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
+ -->
config_sentence_transformers.json ADDED
@@ -0,0 +1,10 @@
+ {
+ "__version__": {
+ "sentence_transformers": "3.4.1",
+ "transformers": "4.48.2",
+ "pytorch": "2.5.1+cu124"
+ },
+ "prompts": {},
+ "default_prompt_name": null,
+ "similarity_fn_name": "cosine"
+ }
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:ee67f27303cefc91334bb6b029e2a4ff5efc8d80d4368233e3e7be1884f797f9
  size 437951328
 
  version https://git-lfs.github.com/spec/v1
+ oid sha256:d0b76a5575e56f18c3c8f2539051fb974daa23ebd23157374f7ed077527e782a
  size 437951328
modules.json ADDED
@@ -0,0 +1,20 @@
+ [
+ {
+ "idx": 0,
+ "name": "0",
+ "path": "",
+ "type": "sentence_transformers.models.Transformer"
+ },
+ {
+ "idx": 1,
+ "name": "1",
+ "path": "1_Pooling",
+ "type": "sentence_transformers.models.Pooling"
+ },
+ {
+ "idx": 2,
+ "name": "2",
+ "path": "2_Normalize",
+ "type": "sentence_transformers.models.Normalize"
+ }
+ ]
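The three modules in `modules.json` describe the inference pipeline: the Transformer produces per-token vectors, the Pooling module selects the CLS token's vector (`pooling_mode_cls_token: true` in `1_Pooling/config.json`), and the Normalize module rescales it to unit length. A minimal sketch of the last two stages with toy numbers (the real model emits `seq_len × 768` token vectors, not the tiny example here):

```python
import math

def cls_pool(token_embeddings):
    # CLS pooling: the sentence embedding is the first token's vector.
    return token_embeddings[0]

def l2_normalize(vec):
    # Unit-length output makes dot product equal cosine similarity.
    norm = math.sqrt(sum(x * x for x in vec))
    return [x / norm for x in vec]

# Toy Transformer output: 3 tokens x 4 dims.
tokens = [
    [3.0, 4.0, 0.0, 0.0],  # [CLS]
    [1.0, 0.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
]
embedding = l2_normalize(cls_pool(tokens))
print(embedding)  # [0.6, 0.8, 0.0, 0.0]
```

Unit normalization is why the card can report `similarity_fn_name: cosine`: on unit vectors, a plain dot product already computes cosine similarity.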
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
+ {
+ "max_seq_length": 512,
+ "do_lower_case": true
+ }